CTLT and Friends / Group items tagged "tools"


Joshua Yeidel

Video: Can Web Tools Replace Blackboard? - Chronicle.com - 0 views

  •  
    In this 4-minute video, Jim Groom articulates the "LMS is out-of-date" argument and suggests that the future of online education depends on re-imagining the "form". He points to issues like openness of code and ownership of data.
Nils Peterson

The End in Mind - 0 views

shared by Nils Peterson on 31 Jul 09
  • A rapidly growing number of people are creating their own personal learning environments with tools freely available to them, without the benefit of a CMS. As Christensen would say, they have hired different technologies to do the job of a CMS for them. But the technologies they’re hiring are more flexible, accessible and learner-centered than today’s CMSs. This is not to say that CMSs are about to disappear. Students enrolled in institutions of higher learning will certainly continue to participate in CMS-delivered course sites, but since these do not generally persist over time, the really valuable learning technologies will increasingly be in the cloud.
    • Nils Peterson
       
      Jon Mott thinking about the Bb World, CMSes in general and Innovator's Dilemma.
  • Both administration and pedagogy are necessary in schools. They are also completely different in what infrastructure they require. This (in my opinion) has been the great failing of VLEs – they all try to squeeze the round pedagogy peg into the square administration hole. It hasn’t worked very well. Trying to coax collaboration in what is effectively an administrative environment, without the porous walls that social media thrives on, hasn’t worked. The ‘walled garden’ of the VLE is just not as fertile as the juicy jungle outside, and not enough seeds blow in on the wind.
Nils Peterson

The Edurati Review: Can "The Least Of Us" Disrupt and Change Education for "The Rest Of... - 1 views

  • Internet access brings knowledge and information to the poor around the world. The reality is that a poor person is more likely to gain access to the Internet and the world of knowledge and information that it brings, than he or she is to get a well-trained teacher in school. Disruption will come when the poor of the world figure out ways to educate themselves and their neighbors via the Internet. Of course this education won’t match the focus, rigor, and quality of Western schools, but nevertheless, the drive and need to learn will create a youth movement in these developing countries for using the Internet as a tool to educate themselves and others. And if all one has is the Internet, one is eventually going to get very good at using it to meet their needs. He or she will develop methods and practices that seem strange, different, and unorthodox. They will rely on the Internet as a source of education. Some in the West might begin to look at these poor kids in developing countries teaching themselves and their neighbors without classrooms and without teachers. Some might begin to wonder and ask, "If it works for them, might it work for us?" Some might adopt some of these strange, different, and unorthodox practices.
    • Nils Peterson
       
      Speculation on the source of "disruption" to education. Might fit Clayton Christensen's definition, and the author speculates it would be powered by youth, following previous youth movements.
Gary Brown

It's the Learning, Stupid - Lumina Foundation: Helping People Achieve Their Potential - 3 views

  • My thesis is this. We live in a world where much is changing, quickly. Economic crises, technology, ideological division, and a host of other factors have all had a profound influence on who we are and what we do in higher education. But when all is said and done, it is imperative that we not lose sight of what matters most. To paraphrase the oft-used maxim of the famous political consultant James Carville, it's the learning, stupid.
  • We believe that, to significantly increase higher education attainment rates, three intermediate outcomes must first occur: Higher education must use proven strategies to move students to completion. Quality data must be used to improve student performance and inform policy and decision-making at all levels. The outcomes of student learning must be defined, measured, and aligned with workforce needs. To achieve these outcomes (and thus improve success rates), Lumina has decided to pursue several specific strategies. I'll cite just a few of these many different strategies: We will advocate for the redesign, rebranding and improvement of developmental education. We will explore the development of alternative pathways to degrees and credentials. We will push for smoother systems of transferring credit so students can move more easily between institutions, including from community colleges to bachelor's degree programs.
  • "Lumina defines high-quality credentials as degrees and certificates that have well-defined and transparent learning outcomes which provide clear pathways to further education and employment."
  • ...4 more annotations...
  • And—as Footnote One softly but incessantly reminds us—quality, at its core, must be a measure of what students actually learn and are able to do with the knowledge and skills they gain.
  • and yet we seem reluctant or unable to discuss higher education's true purpose: equipping students for success in life.
  • Research has already shown that higher education institutions vary significantly in the value they add to students in terms of what those students actually learn. Various tools and instruments tell us that some institutions add much more value than others, even when looking at students with similar backgrounds and abilities.
  • The idea with tuning is to take various programs within a specific discipline—chemistry, history, psychology, whatever—and agree on a set of learning outcomes that a degree in the field represents. The goal is not for the various programs to teach exactly the same thing in the same way or even for all of the programs to offer the same courses. Rather, programs can employ whatever techniques they prefer, so long as their students can demonstrate mastery of an agreed-upon body of knowledge and set of skills. To use the musical terminology, the various programs are not expected to play the same notes, but to be "tuned" to the same key.
Nils Peterson

An Expert Surveys the Assessment Landscape - Student Affairs - The Chronicle of Higher ... - 2 views

  • Colleges and universities have plenty of tools, but they must learn to use them more effectively. That is how George D. Kuh describes the state of assessing what college students learn.
Gary Brown

News: Assessing the Assessments - Inside Higher Ed - 2 views

  • The validity of a measure is based on evidence regarding the inferences and assumptions that are intended to be made and the uses to which the measure will be put. Showing that the three tests in question are comparable does not support Shulenburger's assertion regarding the value-added measure as a valid indicator of institutional effectiveness. The claim that public university groups have previously judged the value-added measure as appropriate does not tell us anything about the evidence upon which this judgment was based nor the conditions under which the judgment was reached. As someone familiar with the process, I would assert that there was no compelling evidence presented that these instruments and the value-added measure were validated for making this assertion (no such evidence was available at the time), which is the intended use in the VSA.
  • (however much the sellers of these tests tell you that those samples are "representative"), they provide an easy way out for academic administrators who want to avoid the time-and-effort consuming but incredibly valuable task of developing detailed major program learning outcome statements (even the specialized accrediting bodies don't get down to the level of discrete, operational statements that guide faculty toward appropriate assessment design)
  • If somebody really cared about "value added," they could look at each student's first essay in this course, and compare it with that same student's last essay in this course. This person could then evaluate each individual student's increased mastery of the subject-matter in the course (there's a lot) and also the increased writing skill, if any.
  • ...1 more annotation...
  • These skills cannot be separated out from student success in learning sophisticated subject-matter, because understanding anthropology, or history of science, or organic chemistry, or Japanese painting, is not a matter of absorbing individual facts, but learning facts and ways of thinking about them in a seamless, synthetic way. No assessment scheme that neglects these obvious facts about higher education is going to do anybody any good, and we'll be wasting valuable intellectual and financial resources if we try to design one.
  •  
    Ongoing discussion of these tools. Note Longanecker's comment and ask me why.
Nils Peterson

From SMCEDU: 5 Steps to Make the Social Web Work for Higher Ed - 0 views

  • At a kickoff event tonight in Richmond, Virginia, I got to participate in a panel discussion and hear questions from an audience of college students and professors. One of the questions posed was how those in academia can best put the social web to work for themselves. Far beyond Facebook and LinkedIn, how can this community harness the Internet to be smarter, more efficient, and more productive? Read on for our top five ideas.
    • Nils Peterson
       
      The 5 steps:
      1. Find your network; they say Twitter is a good way to do this.
      2. Keep up; subscribe to RSS feeds of the blogs of the key players you found (a minimal sketch of this step follows below).
      3. Create your identity; get beyond the one you have with Facebook and consider yourname.com.
      4. Contribute content to the conversation; start a blog or website.
      5. Continue to explore and adopt new tools.
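      Step 2 is the most mechanical of the five, so here is a minimal sketch (not from the original post) of what "keep up via RSS" can look like in Python, assuming the third-party feedparser library; the feed URLs are placeholders, not recommendations.

      # Minimal sketch of step 2: keep up with your network's blogs via RSS.
      # Assumes the third-party "feedparser" package (pip install feedparser).
      # The feed URLs below are hypothetical placeholders.
      import feedparser

      FEEDS = [
          "https://example.edu/ctlt/blog/feed",     # placeholder feed URL
          "https://example.org/learning-tech/rss",  # placeholder feed URL
      ]

      def latest_posts(feed_urls, per_feed=3):
          """Return (source, title, link) for the newest entries of each feed."""
          posts = []
          for url in feed_urls:
              feed = feedparser.parse(url)
              source = feed.feed.get("title", url)
              for entry in feed.entries[:per_feed]:
                  posts.append((source, entry.get("title", "untitled"), entry.get("link", "")))
          return posts

      if __name__ == "__main__":
          for source, title, link in latest_posts(FEEDS):
              print(f"{source}: {title}\n  {link}")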
Gary Brown

Schmidt - 3 views

  • There are a number of assessment methods by which learning can be evaluated (exam, practicum, etc.) for the purpose of recognition and accreditation, and there are a number of different purposes for the accreditation itself (i.e., job, social recognition, membership in a group, etc). As our world moves from an industrial to a knowledge society, new skills are needed. Social web technologies offer opportunities for learning, which build these skills and allow new ways to assess them.
  • This paper makes the case for a peer-based method of assessment and recognition as a feasible option for accreditation purposes. The peer-based method would leverage online communities and tools, for example digital portfolios, digital trails, and aggregations of individual opinions and ratings into a reliable assessment of quality. Recognition by peers can have a similar function as formal accreditation, and pathways to turn peer recognition into formal credits are outlined. The authors conclude by presenting an open education assessment and accreditation scenario, which draws upon the attributes of open source software communities: trust, relevance, scalability, and transparency.
  •  
    Kinship here, and familiar friends.
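    The "aggregations of individual opinions and ratings" mentioned in the abstract above could, in the simplest case, be a thresholded average of peer scores. The sketch below is a hypothetical illustration of that idea, not the authors' method; the minimum-rater threshold and the 1-5 rating scale are invented for the example.

    # Hypothetical sketch of rolling peer ratings up into a single quality score.
    # Not the paper's method: it averages scores per artifact and withholds a
    # verdict until a minimum number of peers have rated it.
    from statistics import mean

    MIN_RATERS = 3  # assumed reliability threshold; the paper does not specify one

    def aggregate(ratings):
        """ratings: list of dicts like {"rater": "peer_a", "score": 4} on a 1-5 scale."""
        if len(ratings) < MIN_RATERS:
            return None  # too few peer judgments to treat as a reliable assessment
        return round(mean(r["score"] for r in ratings), 2)

    portfolio_ratings = [
        {"rater": "peer_a", "score": 4},
        {"rater": "peer_b", "score": 5},
        {"rater": "peer_c", "score": 4},
    ]
    print(aggregate(portfolio_ratings))  # -> 4.33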
Gary Brown

At Colleges, Assessment Satisfies Only Accreditors - Letters to the Editor - The Chroni... - 2 views

  • Some of that is due to the influence of the traditional academic freedom that faculty members have enjoyed. Some of it is ego. And some of it is lack of understanding of how it can work. There is also a huge disconnect between satisfying outside parties, like accreditors and the government, and using assessment as a quality-improvement system.
  • We are driven by regional accreditation and program-level accreditation, not by quality improvement. At our institution, we talk about assessment a lot, and do just enough to satisfy the requirements of our outside reviewers.
  • Standardized direct measures, like the Major Field Test for M.B.A. graduates?
  • ...5 more annotations...
  • The problem with the test is that it does not directly align with our program's learning outcomes and it does not yield useful information for closing the loop. So why do we use it? Because it is accepted by accreditors as a direct measure and it is less expensive and time-consuming than more useful tools.
  • Without exception, the most useful information for improving the program and student learning comes from the anecdotal and indirect information.
  • We don't have the time and the resources to do what we really want to do to continuously improve the quality of our programs and instruction. We don't have a culture of continuous improvement. We don't make changes on a regular basis, because we are trapped by the catalog publishing cycle, accreditation visits, and the entrenched misunderstanding of the purposes of assessment.
  • The institutions that use it are ones that have adequate resources to do so. The time necessary for training, whole-system involvement, and developing the programs for improvement is daunting. And it is only being used by one regional accrediting body, as far as I know.
  • Until higher education as a whole is willing to look at changing its approach to assessment, I don't think it will happen
  •  
    The challenge, and another piece of evidence that the nuances of assessment as it relates to teaching and learning remain elusive.