
Home/ beyondwebct/ Group items tagged repository


Barbara Lindsey

http://www.charlesleadbeater.net/cms/xstandard/LearningfromExtremes_WhitePaper.pdf - 0 views

  •  
    A suggested reading by Diego Leal in response to a blog comment by Claudio Pinna on our Learning Repositories post
Barbara Lindsey

Dr. Mashup; or, Why Educators Should Learn to Stop Worrying and Love the Remix | EDUCAU... - 0 views

  • A classroom portal that presents automatically updated syndicated resources from the campus library, news sources, student events, weblogs, and podcasts and that was built quickly using free tools.
  • Increasingly, it's not just works of art that are appropriated and remixed but the functionalities of online applications as well.
  • mashups involve the reuse, or remixing, of works of art, of content, and/or of data for purposes that usually were not intended or even imagined by the original creators.
  • What, exactly, constitutes a valid, original work? What are the implications for how we assess and reward creativity? Can a college or university tap the same sources of innovative talent and energy as Google or Flickr? What are the risks of permitting or opening up to this activity?
    • Barbara Lindsey
       
      Good discussion point
  • Remix is the reworking or adaptation of an existing work. The remix may be subtle, or it may completely redefine how the work comes across. It may add elements from other works, but generally efforts are focused on creating an alternate version of the original. A mashup, on the other hand, involves the combination of two or more works that may be very different from one another. In this article, I will apply these terms both to content remixes and mashups, which originated as a music form but now could describe the mixing of any number of digital media sources, and to data mashups, which combine the data and functionalities of two or more Web applications.
  • Harper's article "The Ecstasy of Influence," the novelist Jonathan Lethem imaginatively reviews the history of appropriation and recasts it as essential to the act of creation.3
  • Lethem's article is a must-read for anyone with an interest in the history of ideas, creativity, and intellectual property. It brilliantly synthesizes multiple disciplines and perspectives into a wonderfully readable and compelling argument. It is also, as the subtitle of his article acknowledges, "a plagiarism." Virtually every passage is a direct lift from another source, as the author explains in his "Key," which gives the source for every line he "stole, warped, and cobbled together." (He also revised "nearly every sentence" at least slightly.) Lethem's ideas noted in the paragraph above were appropriated from Siva Vaidhyanathan, Craig Baldwin, Richard Posner, and George L. Dillon.
  • Reading Walter Benjamin's highly influential 1936 essay "The Work of Art in the Age of Mechanical Reproduction,"4 it's clear that the profound effects of reproductive technology were obvious at that time. As Gould argued in 1964 (influenced by theorists such as Marshall McLuhan5), changes in how art is produced, distributed, and consumed in the electronic age have deep effects on the character of the art itself.
  • Yet the technology developments of the past century have clearly corresponded with a new attitude toward the "aura" associated with a work of invention and with more aggressive attitudes toward appropriation. It's no mere coincidence that the rise of modernist genres using collage techniques and more fragmented structures accompanied the emergence of photography and audio recording.
  • Educational technologists may wonder if "remix" or "content mashup" are just hipper-sounding versions of the learning objects vision that has absorbed so much energy from so many talented people—with mostly disappointing results.
  • The question is, why should a culture of remix take hold when the learning object economy never did?
  • when most learning object repositories were floundering, resource-sharing services such as del.icio.us and Flickr were enjoying phenomenal growth, with their user communities eagerly contributing heaps of useful metadata via simple folksonomy-oriented tagging systems.
  • the standards/practices relationship implicit in the learning objects model has been reversed. With only the noblest of intentions, proponents of learning objects (and I was one of them) went at the problem of promoting reuse by establishing an arduous and complex set of interoperability standards and then working to persuade others to adopt those standards. Educators were asked to take on complex and ill-defined tasks in exchange for an uncertain payoff. Not surprisingly, almost all of them passed.
  • Discoverable Resources
  • Educators might justifiably argue that their materials are more authoritative, reliable, and instructionally sound than those found on the wider Web, but those materials are effectively rendered invisible and inaccessible if they are locked inside course management systems.
  • It's a dirty but open secret that many courses in private environments use copyrighted third-party materials in a way that pushes the limits of fair use—third-party IP is a big reason why many courses cannot easily be made open.
  • The potential payoff for using open and discoverable resources, open and transparent licensing, and open and remixable formats is huge: more reuse means that more dynamic content is being produced more economically, even if the reuse happens only within an organization. And when remixing happens in a social context on the open web, people learn from each other's process.
  • Part of making a resource reusable involves making the right choices for file formats.
  • To facilitate the remixing of materials, educators may want to consider making the source files that were used to create a piece of multimedia available along with the finished result.
  • In addition to choosing the right file format and perhaps offering the original sources, another issue to consider when publishing content online is the critical question: "Is there an RSS feed available?" If so, conversion tools such as Feed2JS (http://www.feed2JS.org) allow for the republication of RSS-ified content in any HTML Web environment, including a course management system, simply by copying and pasting a few lines of JavaScript code. When an original source syndicated with RSS is updated, that update is automatically rendered anywhere it has been republished.
  • Jack Schofield
  • Guardian Unlimited
  • "An API provides an interface and a set of rules that make it much easier to extract data from a website. It's a bit like a record company releasing the vocals, guitars and drums as separate tracks, so you would not have to use digital processing to extract the parts you wanted."1
  • What's new about mashed-up application development? In a sense, the factors that have promoted this approach are the same ones that have changed so much else about Web culture in recent years. Essential hardware and software has gotten more powerful and for the most part cheaper, while access to high-speed connectivity and the enhanced quality of online applications like Google Docs have improved to the point that Tim O'Reilly and others can talk of "the emergent Internet operating system."15 The growth of user-centered technologies such as blogs have fostered a DIY ("do it yourself") culture that increasingly sees online interaction as something that can be personalized and adapted on the individual level. As described earlier, light syndication and service models such as RSS have made it easier and faster than ever to create simple integrations of diverse media types. David Berlind, executive editor of ZDNet, explains: "With mashups, fewer technical skills are needed to become a developer than ever. Not only that, the simplest ones can be done in 10 or 15 minutes. Before, you had to be a pretty decent code jockey with languages like C++ or Visual Basic to turn your creativity into innovation. With mashups, much the same way blogging systems put Web publishing into the hands of millions of ordinary non-technical people, the barrier to developing applications and turning creativity into innovation is so low that there's a vacuum into which an entire new class of developers will be sucked."16
  • The ability to "clone" other users' mashups is especially exciting: a newcomer does not need to spend time learning how to structure the data flows but can simply copy an existing framework that looks useful and then make minor modifications to customize the result.19
    • Barbara Lindsey
       
      This is the idea behind the MIT repository--remixing content to suit local needs.
  • As with content remixing, open access to materials is not just a matter of some charitable impulse to share knowledge with the world; it is a core requirement for participating in some of the most exciting and innovative activity on the Web.
  • "My Maps" functionality
  • For those still wondering what the value proposition is for offering an open API, Google's development process offers a compelling example of the potential rewards.
    • Barbara Lindsey
       
      Wikinomics
  • Elsewhere, it is difficult to point to significant activity suggesting that the mashup ethos is taking hold in academia the way it is on the wider Web.
  • Yet for the most part, the notion of the data mashup and the required openness is not even a consideration in discussions of technology strategy in higher educational institutions. "Data integration" across campus systems is something that is handled by highly skilled professionals at highly skilled prices.
  • Revealing how a more adventurous and inclusive online development strategy might look on campus, Raymond Yee recently posted a comprehensive proposal for his university (UC Berkeley), in which he outlined a "technology platform" not unlike the one employed by Amazon.com (http://aws.amazon.com/)—resources and access that would be invaluable for the institution's programmers as well as for outside interests to build complementary services.
  • All too often, college and university administrators react to this type of innovation with suspicion and outright hostility rather than cooperation.
  • those of us in higher education who observe the successful practices in the wider Web world have an obligation to consider and discuss how we might apply these lessons in our own contexts. We might ask if the content we presently lock down could be made public with a license specifying reasonable terms for reuse. When choosing a content management system, we might consider how well it supports RSS syndication. In an excellent article in the March/April 2007 issue of EDUCAUSE Review, Joanne Berg, Lori Berquam, and Kathy Christoph listed a number of campus activities that could benefit from engaging social networking technologies.26
  • What might happen if we allow our campus innovators to integrate their practices in these areas in the same way that social networking application developers are already integrating theirs? What is the mission-critical data we cannot expose, and what can we expose with minimal risk? And if the notion of making data public seems too radical a step, can APIs be exposed to selected audiences, such as on-campus developers or consortia partners?
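One of the annotations above describes republishing an RSS feed anywhere on the web simply by pasting in a snippet, as Feed2JS does with JavaScript. A minimal sketch of the same idea in Python, using only the standard library (the sample feed below is invented for illustration; Feed2JS itself emits JavaScript rather than HTML directly):

```python
# Sketch of the Feed2JS idea: render the items of an RSS 2.0 feed as an
# HTML fragment that could be dropped into any web page (e.g. a course
# management system). The SAMPLE_RSS document is a made-up example.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Campus Library News</title>
  <item><title>New hours</title><link>http://example.edu/hours</link></item>
  <item><title>Digital archive launch</title><link>http://example.edu/archive</link></item>
</channel></rss>"""

def rss_to_html(rss_text: str) -> str:
    """Render each <item> of an RSS 2.0 feed as a linked HTML list entry."""
    channel = ET.fromstring(rss_text).find("channel")
    lines = ["<ul>"]
    for item in channel.findall("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="#")
        lines.append(f'  <li><a href="{link}">{title}</a></li>')
    lines.append("</ul>")
    return "\n".join(lines)

print(rss_to_html(SAMPLE_RSS))
```

Because the HTML is regenerated from the feed each time, an update at the original source is automatically reflected wherever the feed has been republished, which is the point the annotation makes.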
Barbara Lindsey

Create, Share and Learn with a Global Repository of Tests - 0 views

  •  
    fall 2011 syllabus
Barbara Lindsey

Powerhouse Museum to launch open access image repository - powerhouse museum, Gov 2.0 -... - 0 views

  • “Since then we have had two million views on 1700 images but for us it goes beyond the views; it is the connection we have made with this audience.”
  • According to Bray, the connection with audience has paid off with the Powerhouse’s community now volunteering to conduct research work that now adds to the museum’s knowledge of its own collection.
  • “They have been tagging, commenting, researching, identifying locations, doing incredible images because they are allowed to use them for free and with no restrictions,” Bray says. “It allows the audience to do citizen curation.”
  • “Our philosophy is not only about making our content accessible to the public, but getting to know our audience; starting conversations. Audiences now really want to get to know the person behind the organisation… they want to participate not just online but on site.”
  • the online archive, which will also grow to include some 50 per cent of all audio-visual content created by the Powerhouse Museum, was driven by Gov 2.0’s central premise of sharing information and engaging with citizens.
Barbara Lindsey

Print: The Chronicle: 6/15/2007: The New Metrics of Scholarly Authority - 0 views

    • Barbara Lindsey
       
      Higher ed slow to respond.
  • Web 2.0 is all about responding to abundance, which is a shift of profound significance.
  • Chefs simply couldn't exist in a world of universal scarcity
  • a time when scholarship, and how we make it available, will be affected by information abundance just as powerfully as food preparation has been.
  • Scholarly communication before the Internet required the intermediation of publishers. The costliness of publishing became an invisible constraint that drove nearly all of our decisions. It became the scholar's job to be a selector and interpreter of difficult-to-find primary and secondary sources; it was the scholarly publisher's job to identify the best scholars with the best perspective and the best access to scarce resources.
    • Barbara Lindsey
       
      Comments?
  • Online scholarly publishing in Web 1.0 mimicked those fundamental conceptions. The presumption was that information scarcity still ruled. Most content was closed to nonsubscribers; exceedingly high subscription costs for specialty journals were retained; libraries continued to be the primary market; and the "authoritative" version was untouched by comments from the uninitiated. Authority was measured in the same way it was in the scarcity world of paper: by number of citations to or quotations from a book or article, the quality of journals in which an article was published, the institutional affiliation of the author, etc.
  • Google
    • Barbara Lindsey
       
      Where critical analysis comes in
  • The challenge for all those sites pertains to abundance:
  • Such systems have not been framed to confer authority, but as they devise means to deal with predators, scum, and weirdos wanting to be a "friend," they are likely to expand into "trust," or "value," or "vouching for my friend" metrics — something close to authority — in the coming years.
  • Recently some more "authoritative" editors have been given authority to override whining ax grinders.
  • In many respects Boing Boing is an old-school edited resource. It doesn't incorporate feedback or comments, but rather is a publication constructed by five editor-writers
  • As the online environment matures, most social spaces in many disciplines will have their own "boingboings."
  • They differ from current models mostly by their feasible computability in a digital environment where all elements can be weighted and measured, and where digital interconnections provide computable context.
  • In the very near future, if we're talking about a universe of hundreds of billions of documents, there will routinely be thousands, if not tens of thousands, if not hundreds of thousands, of documents that are very similar to any new document published on the Web. If you are writing a scholarly article about the trope of smallpox in Shakespearean drama, how do you ensure you'll be read? By competing in computability. Encourage your friends and colleagues to link to your online document. Encourage online back-and-forth with interested readers. Encourage free access to much or all of your scholarly work. Record and digitally archive all your scholarly activities. Recognize others' works via links, quotes, and other online tips of the hat. Take advantage of institutional repositories, as well as open-access publishers. The list could go on.
  • the new authority metrics, instead of relying on scholarly publishers to establish the importance of material for them.
  • They need to play a role in deciding not just what material will be made available online, but also how the public will be allowed to interact with the material. That requires a whole new mind-set.
  • Scholarly publishers
  • Many of the values of scholarship are not well served yet by the Web: contemplation, abstract synthesis, construction of argument.
  • Traditional models of authority will probably hold sway in the scholarly arena for 10 to 15 years, while we work out the ways in which scholarly engagement and significance can be measured in new kinds of participatory spaces.
  • if scholarly output is locked away behind fire walls, or on hard drives, or in print only, it risks becoming invisible to the automated Web crawlers, indexers, and authority-interpreters that are being developed. Scholarly invisibility is rarely the path to scholarly authority.
  • Web 1.0,
  • garbed new business and publishing models in 20th-century clothes.
  • fundamental presumption is one of endless information abundance.
  • Flickr, YouTube
  • micromarkets
  • multiple demographics
  • Abundance leads to immediate context and fact checking, which changes the "authority market" substantially. The ability to participate in most online experiences (via comments, votes, or ratings) is now presumed, and when it's not available, it's missed.
  • "Google interprets a link from Page A to Page B as a vote, by Page A, for Page B. But, Google looks at more than the sheer volume of votes, or links a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves 'important' weigh more heavily and help to make other pages 'important.'"
  • It has its limits, but it also both confers and confirms authority because people tend to point to authoritative sources to bolster their own work.
  • That kind of democratization of authority is nearly unique to wikis that are group edited, since not observation, but active participation in improvement, is the authority metric.
  • user-generated authority, many of which are based on algorithmic analysis of participatory engagement. The emphasis in such models is often not on finding scarce value, but on weeding abundance
  • Authority 3.0 will probably include (the list is long, which itself is a sign of how sophisticated our new authority makers will have to be):
    • Prestige of the publisher (if any).
    • Prestige of peer prereviewers (if any).
    • Prestige of commenters and other participants.
    • Percentage of a document quoted in other documents.
    • Raw links to the document.
    • Valued links, in which the values of the linker and all his or her other links are also considered.
    • Obvious attention: discussions in blogspace, comments in posts, reclarification, and continued discussion.
    • Nature of the language in comments: positive, negative, interconnective, expanded, clarified, reinterpreted.
    • Quality of the context: What else is on the site that holds the document, and what's its authority status?
    • Percentage of phrases that are valued by a disciplinary community.
    • Quality of author's institutional affiliation(s).
    • Significance of author's other work.
    • Amount of author's participation in other valued projects, as commenter, editor, etc.
    • Reference network: the significance rating of all the texts the author has touched, viewed, read.
    • Length of time a document has existed.
    • Inclusion of a document in lists of "best of," in syllabi, indexes, and other human-selected distillations.
    • Types of tags assigned to it, the terms used, the authority of the taggers, the authority of the tagging system.
  • Most technophile thinkers out there believe that Web 3.0 will be driven by artificial intelligences — automated computer-assisted systems that can make reasonable decisions on their own, to preselect, precluster, and prepare material based on established metrics, while also attending very closely to the user's individual actions, desires, and historic interests, and adapting to them.
  •  
    When the system of scholarly communications was dependent on the physical movement of information goods, we did business in an era of information scarcity. As we become dependent on the digital movement of information goods, we find ourselves entering an era of information abundance. In the process, we are witnessing a radical shift in how we establish authority, significance, and even scholarly validity. That has major implications for, in particular, the humanities and social sciences.
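The "link as a vote, weighted by the voter's own importance" idea highlighted above is a paraphrase of Google's PageRank. A minimal power-iteration sketch of that idea (the tiny link graph is invented for illustration, and real PageRank handles dangling pages and much larger scale):

```python
# Sketch of PageRank-style authority: each page splits its current rank
# among the pages it links to, so votes from high-rank pages count more.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start with equal authority
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[p] / len(outlinks)
            for q in outlinks:
                new[q] += share               # p casts a vote for q, weighted by p's rank
        rank = new
    return rank

# Hypothetical link graph: A votes for B; B and C vote for each other.
graph = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(graph)
```

Here B ends up ranked highest: it collects votes from both A and C, and C's vote is itself strengthened by the rank C receives back from B, which is exactly the recursive "important pages confer importance" effect the excerpt describes.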
Barbara Lindsey

Shirky: A Group Is Its Own Worst Enemy - 1 views

  • April 24, 2003
  • I want to talk about a pattern I've seen over and over again in social software that supports large and long-lived groups.
  • definition of social software
  • It's software that supports group interaction
  • how radical that pattern is. The Internet supports lots of communications patterns, principally point-to-point and two-way, one-to-many outbound, and many-to-many two-way.
  • Prior to the Internet, the last technology that had any real effect on the way people sat down and talked together was the table.
  • We've had social software for 40 years at most, dated from the Plato BBS system, and we've only had 10 years or so of widespread availability, so we're just finding out what works. We're still learning how to make these kinds of things.
  • If it's a cluster of half a dozen LiveJournal users, on the other hand, talking about their lives with one another, that's social. So, again, weblogs are not necessarily social, although they can support social patterns.
  • So email doesn't necessarily support social patterns, group patterns, although it can. Ditto a weblog. If I'm Glenn Reynolds, and I'm publishing something with Comments Off and reaching a million users a month, that's really broadcast.
  • So there's this very complicated moment of a group coming together, where enough individuals, for whatever reason, sort of agree that something worthwhile is happening, and the decision they make at that moment is: This is good and must be protected. And at that moment, even if it's subconscious, you start getting group effects. And the effects that we've seen come up over and over and over again in online communities.
  • You are at a party, and you get bored. You say "This isn't doing it for me anymore. I'd rather be someplace else.
  • The party fails to meet some threshold of interest. And then a really remarkable thing happens: You don't leave.
  • That kind of social stickiness is what Bion is talking about.
  • Twenty minutes later, one person stands up and gets their coat, and what happens? Suddenly everyone is getting their coats on, all at the same time. Which means that everyone had decided that the party was not for them, and no one had done anything about it, until finally this triggering event let the air out of the group, and everyone kind of felt okay about leaving.
  • This effect is so steady it's sometimes called the paradox of groups.
  • what's less obvious is that there are no members without a group.
  • there are some very specific patterns that they're entering into to defeat the ostensible purpose of the group meeting together. And he detailed three patterns.
  • The first is sex talk,
  • second basic pattern
  • The identification and vilification of external enemies.
  • So even if someone isn't really your enemy, identifying them as an enemy can cause a pleasant sense of group cohesion. And groups often gravitate towards members who are the most paranoid and make them leaders, because those are the people who are best at identifying external enemies.
  • third pattern Bion identified: Religious veneration
  • The religious pattern is, essentially, we have nominated something that's beyond critique.
  • So these are human patterns that have shown up on the Internet, not because of the software, but because it's being used by humans. Bion has identified this possibility of groups sandbagging their sophisticated goals with these basic urges. And what he finally came to, in analyzing this tension, is that group structure is necessary. Robert's Rules of Order are necessary. Constitutions are necessary. Norms, rituals, laws, the whole list of ways that we say, out of the universe of possible behaviors, we're going to draw a relatively small circle around the acceptable ones.
  • He said the group structure is necessary to defend the group from itself. Group structure exists to keep a group on target, on track, on message, on charter, whatever. To keep a group focused on its own sophisticated goals and to keep a group from sliding into these basic patterns. Group structure defends the group from the action of its own members.
  • technical and social issues are deeply intertwined. There's no way to completely separate them.
  • Some of the users wanted the system to continue to exist and to provide a forum for discussion. And other of the users, the high school boys, either didn't care or were actively inimical. And the system provided no way for the former group to defend itself from the latter.
  • What matters is, a group designed this and then was unable, in the context they'd set up, partly a technical and partly a social context, to save it from this attack from within. And attack from within is what matters.
  • This pattern has happened over and over and over again. Someone built the system, they assumed certain user behaviors. The users came on and exhibited different behaviors. And the people running the system discovered to their horror that the technological and social issues could not in fact be decoupled.
  • And the worst crisis is the first crisis, because it's not just "We need to have some rules." It's also "We need to have some rules for making some rules." And this is what we see over and over again in large and long-lived social software systems. Constitutions are a necessary component of large, long-lived, heterogenous groups.
  • As a group commits to its existence as a group, and begins to think that the group is good or important, the chance that they will begin to call for additional structure, in order to defend themselves from themselves, gets very, very high.
  • The downside of going for size and scale above all else is that the dense, interconnected pattern that drives group conversation and collaboration isn't supportable at any large scale. Less is different -- small groups of people can engage in kinds of interaction that large groups can't. And so we blew past that interesting scale of small groups. Larger than a dozen, smaller than a few hundred, where people can actually have these conversational forms that can't be supported when you're talking about tens of thousands or millions of users, at least in a single group.
  • So the first answer to Why Now? is simply "Because it's time." I can't tell you why it took as long for weblogs to happen as it did, except to say it had absolutely nothing to do with technology. We had every bit of technology we needed to do weblogs the day Mosaic launched the first forms-capable browser. Every single piece of it was right there. Instead, we got Geocities. Why did we get Geocities and not weblogs? We didn't know what we were doing.
  • It took a long time to figure out that people talking to one another, instead of simply uploading badly-scanned photos of their cats, would be a useful pattern. We got the weblog pattern in around '96 with Drudge. We got weblog platforms starting in '98. The thing really was taking off in 2000. By last year, everyone realized: Omigod, this thing is going mainstream, and it's going to change everything.
  • Why was there an eight-year gap between a forms-capable browser and the Pepys diaries? I don't know. It just takes a while for people to get used to these ideas. So, first of all, this is a revolution in part because it is a revolution. We've internalized the ideas and people are now working with them. Second, the things that people are now building are web-native.
  • A weblog is web-native. It's the web all the way in. A wiki is a web-native way of hosting collaboration. It's lightweight, it's loosely coupled, it's easy to extend, it's easy to break down. And it's not just the surface, like oh, you can just do things in a form. It assumes http is transport. It assumes markup in the coding. RSS is a web-native way of doing syndication. So we're taking all of these tools and we're extending them in a way that lets us build new things really quickly.
  • Third, in David Weinberger's felicitous phrase, we can now start to have a Small Pieces Loosely Joined pattern.
  • You can say, in the conference call or the chat: "Go over to the wiki and look at this."
  • It's just three little pieces of software laid next to each other and held together with a little bit of social glue. This is an incredibly powerful pattern. It's different from: Let's take the Lotus juggernaut and add a web front-end.
  • And finally, and this is the thing that I think is the real freakout, is ubiquity.
  • In many situations, all people have access to the network. And "all" is a different kind of amount than "most." "All" lets you start taking things for granted.
  • But for some groups of people -- students, people in high-tech offices, knowledge workers -- everyone they work with is online. Everyone they're friends with is online. Everyone in their family is online.
  • And this pattern of ubiquity lets you start taking this for granted.
  • There's a second kind of ubiquity, which is the kind we're enjoying here thanks to Wifi. If you assume whenever a group of people are gathered together, that they can be both face to face and online at the same time, you can start to do different kinds of things. I now don't run a meeting without either having a chat room or a wiki up and running. Three weeks ago I ran a meeting for the Library of Congress. We had a wiki, set up by Socialtext, to capture a large and very dense amount of technical information on long-term digital preservation.
  • The people who organized the meeting had never used a wiki before, and now the Library of Congress is talking as if they always had a wiki for their meetings, and are assuming it's going to be at the next meeting as well -- the wiki went from novel to normal in a couple of days.
  • It really quickly becomes an assumption that a group can do things like "Oh, I took my PowerPoint slides, I showed them, and then I dumped them into the wiki. So now you can get at them." It becomes a sort of shared repository for group memory. This is new. These kinds of ubiquity, both everyone is online, and everyone who's in a room can be online together at the same time, can lead to new patterns.
  • "What is required to make a large, long-lived online group successful?" and I think I can now answer with some confidence: "It depends."
  • The normal experience of social software is failure. If you go into Yahoo groups and you map out the subscriptions, it is, unsurprisingly, a power law. There's a small number of highly populated groups, a moderate number of moderately populated groups, and this long, flat tail of failure. And the failure is inevitably more than 50% of the total mailing lists in any category. So it's not like a cake recipe. There's nothing you can do to make it come out right every time.
  • Of the things you have to accept, the first is that you cannot completely separate technical and social issues.
  • So the group is real. It will exhibit emergent effects. It can't be ignored, and it can't be programmed, which means you have an ongoing issue. And the best pattern, or at least the pattern that's worked the most often, is to put into the hands of the group itself the responsibility for defining what value is, and defending that value, rather than trying to ascribe those things in the software upfront.
  • Members are different than users. A pattern will arise in which there is some group of users that cares more than average about the integrity and success of the group as a whole. And that becomes your core group, Art Kleiner's phrase for "the group within the group that matters most."
  • But in all successful online communities that I've looked at, a core group arises that cares about and gardens effectively. Gardens the environment, to keep it growing, to keep it healthy.
  • The core group has rights that trump individual rights in some situations.
  • And absolute citizenship, with the idea that if you can log in, you are a citizen, is a harmful pattern, because it is the tyranny of the majority. So the core group needs ways to defend itself -- both in getting started and because of the effects I talked about earlier -- the core group needs to defend itself so that it can stay on its sophisticated goals and away from its basic instincts.
  • All groups of any integrity have a constitution. The constitution is always partly formal and partly informal.
  • If you were going to build a piece of social software to support large and long-lived groups, what would you design for? The first thing you would design for is handles the user can invest in.
  • Second, you have to design a way for there to be members in good standing. Have to design some way in which good works get recognized. The minimal way is, posts appear with identity.
  • Three, you need barriers to participation.
  • It has to be hard to do at least some things on the system for some users, or the core group will not have the tools that they need to defend themselves.
  • The user of social software is the group, not the individual.
  • Reputation is not necessarily portable from one situation to another
  • If you want a good reputation system, just let me remember who you are. And if you do me a favor, I'll remember it. And I won't store it in the front of my brain, I'll store it here, in the back. I'll just get a good feeling next time I get email from you; I won't even remember why. And if you do me a disservice and I get email from you, my temples will start to throb, and I won't even remember why. If you give users a way of remembering one another, reputation will happen.
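Shirky's memory-based reputation idea can be sketched as a simple per-handle ledger. This is only an illustrative sketch of the idea described above; the class and method names (`ReputationMemory`, `record_favor`, and so on) are invented for this example, not part of any system Shirky describes:

```python
from collections import defaultdict

class ReputationMemory:
    """A running 'feeling' score per handle: favors raise it,
    disservices lower it. The specific reasons are forgotten;
    only the accumulated impression remains, as in Shirky's
    description of how people remember one another."""

    def __init__(self):
        self._feeling = defaultdict(int)  # unknown handles start neutral

    def record_favor(self, handle):
        self._feeling[handle] += 1

    def record_disservice(self, handle):
        self._feeling[handle] -= 1

    def impression(self, handle):
        # Positive: good feeling; negative: throbbing temples; zero: neutral.
        return self._feeling[handle]

memory = ReputationMemory()
memory.record_favor("alice")
memory.record_favor("alice")
memory.record_disservice("bob")
print(memory.impression("alice"))  # 2
print(memory.impression("bob"))    # -1
```

The point of the sketch is that reputation emerges from remembered interactions tied to stable handles, rather than from an explicit, portable score assigned upfront.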
Barbara Lindsey

Zotero Style Repository - 1 views

  • A list of bibliographic citations
  • Is this list complete?
Barbara Lindsey

2010 Horizon Report » One Year or Less: Open Content - 0 views

  • The movement toward open content reflects a growing shift in the way academics in many parts of the world are conceptualizing education to a view that is more about the process of learning than the information conveyed in their courses. Information is everywhere; the challenge is to make effective use of it.
  • As customizable educational content is made increasingly available for free over the Internet, students are learning not only the material, but also skills related to finding, evaluating, interpreting, and repurposing the resources they are studying in partnership with their teachers.
  • collective knowledge and the sharing and reuse of learning and scholarly content,
  • ...16 more annotations...
  • the notion of open content is to take advantage of the Internet as a global dissemination platform for collective knowledge and wisdom, and to design learning experiences that maximize the use of it.
  • The role of open content producers has evolved as well, away from the idea of authoritative repositories of content and towards the broader notion of content being both free and ubiquitous.
  • schools like Tufts University (and many others) now consider making their course materials available to the public a social responsibility.
  • Many believe that reward structures that support the sharing of work in progress, ongoing research, highly collaborative projects, and a broad view of what constitutes scholarly publication are key challenges that institutions need to solve.
  • learning to find useful resources within a given discipline, assess the quality of content available, and repurpose them in support of a learning or research objective are in and of themselves valuable skills for any emerging scholar, and many adherents of open content list that aspect among the reasons they support the use of shareable materials.
  • Open content shifts the learning equation in a number of interesting ways; the most important is that its use promotes a set of skills that are critical in maintaining currency in any discipline — the ability to find, evaluate, and put new information to use.
  • Communities of practice and learning communities have formed around open content in a great many disciplines, and provide practitioners and independent learners alike an avenue for continuing education.
  • Art History. Smarthistory, an open educational resource dedicated to the study of art, seeks to replace traditional art history textbooks with an interactive, well-organized website. Search by time period, style, or artist (http://smarthistory.org).
  • American Literature before 1860 http://enh241.wetpaint.com Students in this course, held at Mesa Community College, contribute to the open course material as part of their research. MCC also features a number of lectures on YouTube (see http://www.youtube.com/user/mesacc#p/p).
  • Carnegie Mellon University’s Open Learning Initiative http://oli.web.cmu.edu/openlearning/ The Open Learning Initiative offers instructor-led and self-paced courses; any instructor may teach with the materials, regardless of affiliation. In addition, the courses include student assessment and intelligent tutoring capability.
  • Connexions http://cnx.org Connexions offers small modules of information and encourages users to piece together these chunks to meet their individual needs.
  • eScholarship: University of California http://escholarship.org/about_escholarship.html eScholarship provides peer review and publishing for scholarly articles, books, and papers, using an open content model. The service also includes tools for dissemination and research.
  • MIT OpenCourseWare http://ocw.mit.edu The Massachusetts Institute of Technology publishes lectures and materials from most of its undergraduate and graduate courses online, where they are freely available for self-study.
  • Open.Michigan’s dScribe Project https://open.umich.edu/projects/oer.php The University of Michigan’s Open.Michigan initiative houses several open content projects. One, dScribe, is a student-centered approach to creating open content. Students work with faculty to select and vet resources, easing the staffing and cost burden of content creation while involving the students in developing materials for themselves and their peers.
  • Center for Social Media Publishes New Code of Best Practices in OCW http://criticalcommons.org/blog/content/center-for-social-media-publishes-new-code-of-best-practices-in-ocw (Critical Commons, 25 October 2009.) The advocacy group Critical Commons seeks to promote the use of media in open educational resources. Their Code of Best Practices in Fair Use for OpenCourseWare is a guide for content developers who want to include fair-use material in their offerings.
  • Flat World Knowledge: A Disruptive Business Model http://industry.bnet.com/media/10003790/flat-world-knowledge-a-disruptive-business-model/ (David Weir, BNET, 20 August 2009.) Flat World Knowledge is enjoying rapid growth, from 1,000 students in the spring of 2009 to 40,000 in the fall semester using their materials. The company’s business model pays a higher royalty percentage to textbook authors and charges students a great deal less than traditional publishers.
Barbara Lindsey

News: The Web of Babel - Inside Higher Ed - 0 views

  • Some adventurous professors have used Twitter as a teaching tool for at least a few years. At a presentation at Educause in 2009, W. Gardner Campbell, director of the academy of teaching and learning at Baylor University, extolled the virtues of allowing students to pose questions to the professor and each other — an important part of the thinking and learning process — without having to raise their hands to do so immediately and aloud. And in November, a group of professors published a scientific paper suggesting that bringing Twitter into the learning process might boost student engagement and performance.
  • But while Lomicka and her tech-forward peers are not advocating that every college go the way of Chapel Hill, they are finding out that some relatively novel teaching technologies that are used by academics of all stripes, such as Twitter and iTunes U, are particularly useful for teaching languages.
  • At Emory University, language instructional content is far and away the biggest export of its public repository on iTunes U, where visitors from around the world have downloaded more than 10 million files since Emory opened the site in 2007.
  • ...4 more annotations...
  • Language content makes up about 95 percent of the downloads from the Emory iTunes U site.
  • the most popular content is audio and video files that were originally developed not for a general audience, but by professors as supplements to college-level coursework,
  • Because language demonstrations often require audio and sometimes video components (e.g., tutorials on how to write in a character-based alphabet), and students often like to practice while on the move, iTunes is in many ways an ideal vehicle for language-based instructional content.
  • what we do offer is an online supplement that enhances what happens both in the classroom and in foreign study in the culture — and it is always there as a resource for our students, because it’s online.”