
Barbara Lindsey

Web 2.0 Storytelling: Emergence of a New Genre (EDUCAUSE Review) | EDUCAUSE - 2 views

  • A story is told by one person or by a creative team to an audience that is usually quiet, even receptive. Or at least that’s what a story used to be, and that’s how a story used to be told. Today, with digital networks and social media, this pattern is changing. Stories now are open-ended, branching, hyperlinked, cross-media, participatory, exploratory, and unpredictable. And they are told in new ways: Web 2.0 storytelling picks up these new types of stories and runs with them, accelerating the pace of creation and participation while revealing new directions for narratives to flow.
    • Barbara Lindsey
       
      Do you agree with this statement?
    • loisramirez
       
      I also agree with the statement. A story in this age can take on a life of its own (or many, depending on the variations created); it allows constant input by others and, consequently, the evolution of both the text and the author.
  • To further define the term, we should begin by explaining what we mean by its first part: Web 2.0. Tim O'Reilly coined Web 2.0 in 2004,1 but the label remains difficult to acceptably define. For our present discussion, we will identify two essential features that are useful in distinguishing Web 2.0 projects and platforms from the rest of the web: microcontent and social media.2
  • creating a website through Web 2.0 tools is a radically different matter compared with the days of HTML hand-coding and of moving files with FTP clients.
  • out of those manifold ways of writing and showing have emerged new practices for telling stories.
  • Web 2.0 platforms are often structured to be organized around people rather than the traditional computer hierarchies of directory trees.
    • loisramirez
       
      I think this is a very important feature. Since the web is no longer static and is more people-friendly, we as users feel more encouraged to collaborate and create our own content.
  • Websites designed in the 1990s and later offered few connecting points for individuals, generally speaking, other than perhaps a guestbook or a link to an e-mail address. But Web 2.0 tools are built to combine microcontent from different users with a shared interest:
  • If readers closely examine a Web 2.0 project, they will find that it is often touched by multiple people, whether in the content creation or via associated comments or discussion areas. If they participate actively, by contributing content, we have what many call social media.
  • But Web 2.0's lowered bar to content creation, combined with increased social connectivity, ramps up the ease and number of such conversations, which are able to extend outside the bounds of a single environment.
    • Barbara Lindsey
       
      Does the definition of Web 2.0 given in this article help you to better understand your experiences thus far in this course?
  • Another influential factor of Web 2.0 is findability: the use of comprehensive search tools that help story creators (and readers) quickly locate related microcontent with just a few keywords typed into a search field.
  • Social bookmarking and content tagging
  • the "art of conveying events in words, images, and sounds often by improvisation or embellishment."4 Annette Simmons sees the storyteller’s empathy and sensory detail as crucial to "the unique capability to tap into a complex situation we have all experienced and which we all recognize."5
    • loisramirez
       
      I also agree with this comment, something as simple as a keyword can trigger a memory and bring back information that we have learned.
  • Web 2.0 stories are often broader: they can represent history, fantasy, a presentation, a puzzle, a message, or something that blurs the boundaries of reality and fiction.
  • On one level, web users experienced a great deal of digital narratives created in non-web venues but published in HTML, such as embedded audio clips, streaming video, and animation through the Flash plug-in. On another level, they experienced stories using web pages as hypertext lexia, chunks of content connected by hyperlinks.
  • While HTML narratives continued to be produced, digital storytelling by video also began, drawing on groundbreaking video projects from the 1970s.
  • By the time of the emergence of blogs and YouTube as cultural media outlets, Tim O'Reilly's naming of Web 2.0, and the advent of social media, storytelling with digital tools had been at work for nearly a generation.
  • Starting from our definitions, we should expect Web 2.0 storytelling to consist of Web 2.0 practices.
  • In each of these cases, the relative ease of creating web content enabled social connections around and to story materials.
  • Web 2.0 creators have many options about the paths to set before their users. Web 2.0 storytelling can be fully hypertextual in its multilinearity. At any time, the audience can go out of the bounds of the story to research information (e.g., checking names in Google searches or looking for background information in Wikipedia).
  • User-generated content is a key element of Web 2.0 and can often enter into these stories. A reader can add content into story platforms directly: editing a wiki page, commenting on a post, replying in a Twitter feed, posting a video response in YouTube. Those interactions fold into the experience of the overall story from the perspective of subsequent readers.
  • On a less complex level, consider the 9th Btn Y & L War Diaries blog project, which posts diary entries from a World War I veteran. A June 2008 post (http://yldiaries.blogspot.com/2008_06_01_archive.html) contains a full wartime document, but the set of comments from others (seven, as of this writing) offer foreshadowing, explication of terms, and context.
    • Barbara Lindsey
       
      Consider how these new media create rich dissertation and research opportunities.
  • As with the rest of Web 2.0, it is up to readers and viewers to analyze and interpret such content and usually to do so collaboratively.
  • At times, this distributed art form can range beyond the immediate control of a creator.
  • Creators can stage content from different sites.
  • Other forms leverage the Web 2.0 strategies of aggregating large amounts of microcontent and creatively selecting patterns out of an almost unfathomable volume of information.
  • The Twitter content form (140-character microstories) permits stories to be told in serialized portions spread over time.
    • loisramirez
       
      It is also a great way to practice creative writing: the 140-character limitation poses a new challenge for a writer, namely how to say a lot in just a few words.
  • It also poses several challenges: to what extent can we fragment (or ‘microchunk,’ in the latest parlance) literature before it becomes incoherent? How many media can literature be forced into—if, indeed, there is any limit?"
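The "microchunking" question above can be made concrete. As a rough sketch (the 140-character limit mirrors Twitter's constraint at the time; the function name and splitting strategy are my own illustration, not from the article), a serializer might break a longer text into post-sized pieces on word boundaries:

```python
def microchunk(text, limit=140):
    """Split a story into post-sized chunks, breaking only on word boundaries."""
    words = text.split()
    chunks, current = [], ""
    for word in words:
        candidate = (current + " " + word).strip()
        if len(candidate) <= limit:
            current = candidate          # word still fits in the current chunk
        else:
            chunks.append(current)       # close out this chunk, start a new one
            current = word
    if current:
        chunks.append(current)
    return chunks
```

Serialized over time, each chunk becomes one installment of the story; the coherence question the authors raise is whether readers can reassemble the whole from such fragments.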
  • Facebook application that remixes photos drawn from Flickr (based on tags) with a set of texts that generate a dynamic graphic novel.
  • movie trailer recuts
  • At a different—perhaps meta—level, the boundaries of Web 2.0 stories are not necessarily clear. A story's boundaries are clear when it is self-contained, say in a DVD or Xbox 360 game. But can we know for sure that all the followers of a story's Twitter feed, for example, are people who are not involved directly in the project? Turning this question around, how do we know that we've taken the right measure of just how far a story goes, when we could be missing one character's blog or a setting description carefully maintained by the author on Wikipedia?
  • The Beast was described by its developer, Sean Stewart: “We would tell a story that was not bound by communication platform: it would come at you over the web, by email, via fax and phone and billboard and TV and newspaper, SMS and skywriting and smoke signals too if we could figure out how.
  • instead of telling a story, we would present the evidence of that story, and let the players tell it to themselves.”15
    • Barbara Lindsey
       
      How might your students who come to your courses with these kinds of experiences impact the way you present your content?
  • In addition, the project served as an illustrative example of the fact that no one can know about all of the possible web tools that are available.
    • Barbara Lindsey
       
      How might we address this conundrum?
  • web video storytelling, primarily through YouTube
  • Web 2.0 storytelling offers two main applications for colleges and universities: as composition platform and as curricular object.
  • Students can use blogs as character studies.
  • The reader is driven to read more, not only within the rest of that post but also across the other sites of the story: the archive of posts so far, the MySpace page, the resources copied and pointed to. Perhaps the reader ranges beyond the site, to the rest of the research world—maybe he or she even composes a response in some Web 2.0 venue.
  • Yet the blog form, which accentuates this narrative, is accessible to anyone with a browser. Examples like Project 1968 offer ready models for aspiring writers to learn from. Even though the purpose of Project 1968 is not immediately tied to a class, it is a fine example for all sorts of curricular instances, from history to political science, creative writing to gender studies, sociology to economics.
  • it’s worth remembering that using Web 2.0 storytelling is partly a matter of scale. Some projects can be Web 2.0 stories, while others integrate Web 2.0 storytelling practices.
  • Lecturers are familiar with telling stories as examples, as a way to get a subject across. They end discussions with a challenging question and create characters to embody parts of content (political actors, scientists, composite types). Imagine applying those habits to a class Twitter feed or Facebook group.
  • For narrative studies, Web 2.0 stories offer an unusual blend of formal features, from the blurry boundaries around each story to questions of chronology.
  • An epistolary novel, trial documents, a lab experiment, or a soldier's diaries—for example, WW1: Experiences of an English Soldier (http://wwar1.blogspot.com/)—come to life in this new format.
  • epigrams are well suited to being republished or published by microblogging tools, which focus the reader’s attention on these compressed phrases. An example is the posting of Oscar Wilde’s Phrases and Philosophies for the Use of the Young (1894), on Twitter (http://twitter.com/oscarwilde). Other compressed forms of writing can be microblogged also, such as Félix Fénéon's Novels in Three Lines (1906), also on Twitter (http://twitter.com/novelsin3lines). As Dan Visel observed of the latter project: “Fénéon . . . was secretly a master of miniaturized text. . . . Fénéon's hypercompression lends itself to Twitter. In a book, these pieces don't quite have space to breathe; they're crowded by each other, and it's more difficult for the reader to savor them individually. As Twitter posts, they're perfectly self-contained, as they would have been when they appeared as feuilleton.”21
  • A publicly shared Web 2.0 story, created by students for a class, afterward becomes something that other students can explore. Put another way, this learning tool can produce materials that subsequently will be available as learning objects.
  • We expect to see new forms develop from older ones as this narrative world grows—even e-mail might become a new storytelling tool.22 Moreover, these storytelling strategies could be supplanted completely by some semantic platform currently under development. Large-scale gaming might become a more popular engine for content creation. And mobile devices could make microcontent the preferred way to experience digital stories.
  • perhaps the best approach for educators is simply to give Web 2.0 storytelling a try and see what happens. We invite you to jump down the rabbit hole. Add a photo to Flickr and use that as a writing prompt. Flesh out a character in Twitter. Follow a drama unfolding on YouTube. See how a wiki supports the gradual development of a setting. Then share with all of us what you have learned about this new way of telling, and listening to, stories.
  • The interwoven characters, relationships, settings, and scenes that result are the stuff of stories, regardless of how closely mapped onto reality they might be; this also distinguishes a Web 2.0 story from other blogging forms, such as political or project sites (except as satire or criticism!).
  • in sharp contrast to the singular flow of digital storytelling. In the latter form, authors create linear narratives, bound to the clear, unitary, and unidirectional timeline of the video format and the traditional story arc. Web 2.0 narratives can follow that timeline, and podcasts in particular must do so. But they can also link in multiple directions.
    By Bryan Alexander and Alan Levine
Barbara Lindsey

Web 2.0: A New Wave of Innovation for Teaching and Learning? (EDUCAUSE Review) | EDUCAU... - 0 views

  • Web 2.0. It is about no single new development. Moreover, the term is often applied to a heterogeneous mix of relatively familiar and also very emergent technologies
  • Ultimately, the label “Web 2.0” is far less important than the concepts, projects, and practices included in its scope.
  • Social software has emerged as a major component of the Web 2.0 movement. The idea dates as far back as the 1960s and JCR Licklider’s thoughts on using networked computing to connect people in order to boost their knowledge and their ability to learn. The Internet technologies of the subsequent generation have been profoundly social, as listservs, Usenet groups, discussion software, groupware, and Web-based communities have linked people around the world.
  • It is true that blogs are Web pages, but their reverse-chronological structure implies a different rhetorical purpose than a Web page, which has no inherent timeliness. That altered rhetoric helped shape a different audience, the blogging public, with its emergent social practices of blogrolling, extensive hyperlinking, and discussion threads attached not to pages but to content chunks within them. Reading and searching this world is significantly different from searching the entire Web world. Still, social software does not indicate a sharp break with the old but, rather, the gradual emergence of a new type of practice.
  • Rather than following the notion of the Web as book, they are predicated on microcontent. Blogs are about posts, not pages. Wikis are streams of conversation, revision, amendment, and truncation. Podcasts are shuttled between Web sites, RSS feeds, and diverse players. These content blocks can be saved, summarized, addressed, copied, quoted, and built into new projects. Browsers respond to this boom in microcontent with bookmarklets in toolbars, letting users fling something from one page into a Web service that yields up another page. AJAX-style pages feed content bits into pages without reloading them, like the frames of old but without such blatant seams. They combine the widely used, open XML standard with Java functions.3 Google Maps is a popular example of this, smoothly drawing directional information and satellite imagery down into a browser.
  • Web 2.0 builds on this original microcontent drive, with users developing Web content, often collaboratively and often open to the world.
  • openness remains a hallmark of this emergent movement, both ideologically and technologically.
  • Drawing on the “wisdom of crowds” argument, Web 2.0 services respond more deeply to users than Web 1.0 services. A leading form of this is a controversial new form of metadata, the folksonomy.
  • Third, people tend to tag socially. That is, they learn from other taggers and respond to other, published groups of tags, or “tagsets.”
  • First, users actually use tags.
  • Social bookmarking is one of the signature Web 2.0 categories, one that did not exist a few years ago and that is now represented by dozens of projects.
  • This is classic social software—and a rare case of people connecting through shared metadata.
  • RawSugar (http://www.rawsugar.com/) and several others expand user personalization. They can present a user’s picture, some background about the person, a feed of their interests, and so on, creating a broader base for bookmark publishing and sharing. This may extend the appeal of the practice to those who find the focus of del.icio.us too narrow. In this way too, a Web 2.0 project learns from others—here, blogs and social networking tools.
  • How can social bookmarking play a role in higher education? Pedagogical applications stem from their affordance of collaborative information discovery.
  • First, they act as an “outboard memory,” a location to store links that might be lost to time, scattered across different browser bookmark settings, or distributed in e-mails, printouts, and Web links. Second, finding people with related interests can magnify one’s work by learning from others or by leading to new collaborations. Third, the practice of user-created tagging can offer new perspectives on one’s research, as clusters of tags reveal patterns (or absences) not immediately visible by examining one of several URLs. Fourth, the ability to create multi-authored bookmark pages can be useful for team projects, as each member can upload resources discovered, no matter their location or timing. Tagging can then surface individual perspectives within the collective. Fifth, following a bookmark site gives insights into the owner’s (or owners’) research, which could play well in a classroom setting as an instructor tracks students’ progress. Students, in turn, can learn from their professor’s discoveries.
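The third and fifth points above hinge on aggregating tags across users. A minimal sketch of how shared metadata connects people (the usernames, tags, and function names here are invented for illustration, not drawn from any actual service):

```python
from collections import Counter

# Hypothetical bookmark data: each user mapped to the tags they have applied.
bookmarks = {
    "alice": ["web2.0", "pedagogy", "wiki"],
    "bob":   ["web2.0", "folksonomy", "wiki"],
    "carol": ["assessment", "pedagogy"],
}

def tag_frequencies(data):
    """Count how often each tag appears across all users, revealing clusters."""
    return Counter(tag for tags in data.values() for tag in tags)

def users_sharing(data, tag):
    """Find the users connected through a shared tag."""
    return sorted(user for user, tags in data.items() if tag in tags)
```

Frequency clusters surface the patterns (or absences) the article mentions, and the shared-tag lookup is the "rare case of people connecting through shared metadata" in miniature.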
  • After e-mail lists, discussion forums, groupware, documents edited and exchanged between individuals, and blogs, perhaps the writing application most thoroughly grounded in social interaction is the wiki. Wiki pages allow users to quickly edit their content from within the browser window.11 They originally hit the Web in the late 1990s (another sign that Web 2.0 is emergent and historical, not a brand-new thing)
  • How do social writing platforms intersect with the world of higher education? They appear to be logistically useful tools for a variety of campus needs, from student group learning to faculty department work to staff collaborations. Pedagogically, one can imagine writing exercises based on these tools, building on the established body of collaborative composition practice. These services offer an alternative platform for peer editing, supporting the now-traditional elements of computer-mediated writing—asynchronous writing, groupwork for distributed members
  • Blogging has become, in many ways, the signature item of social software, being a form of digital writing that has grown rapidly into an influential force in many venues, both on- and off-line. One reason for the popularity of blogs is the way they embody the read/write Web notion. Readers can push back on a blog post by commenting on it. These comments are then addressable, forming new microcontent. Web services have grown up around blog comments, most recently in the form of aggregation tools, such as coComment (http://www.cocomment.com/). CoComment lets users keep track of their comments across myriad sites, via a tiny bookmarklet and a single Web page.
  • Technorati (http://technorati.com/) and IceRocket (http://icerocket.com/) head in the opposite direction of these sites, searching for who (usually a blogger) has recently linked to a specific item or site. Technorati is perhaps the most famous blog-search tool. Among other functions, it has emphasized tagging as part of search and discovery, recommending (and rewarding) users who add tags to their blog posts. Bloggers can register their site for free with Technorati; their posts will then be searchable by content and supplemental tags.
  • Many of these services allow users to save their searches as RSS feeds to be returned to and examined in an RSS reader, such as Bloglines (http://www.bloglines.com/) or NetNewsWire (http://ranchero.com/netnewswire/). This subtle ability is neatly recursive in Web 2.0 terms, since it lets users create microcontent (RSS search terms) about microcontent (blog posts). Being merely text strings, such search feeds are shareable in all sorts of ways, so one can imagine collaborative research projects based on growing swarms of these feeds—social bookmarking plus social search.
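Because an RSS feed is just structured text, the "microcontent about microcontent" idea is easy to see in code. A sketch using only the Python standard library (the feed XML below is made up to stand in for a saved search feed; real feeds carry more fields):

```python
import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 snippet standing in for a saved-search feed.
RSS = """<rss version="2.0"><channel>
  <title>Search: "web 2.0 storytelling"</title>
  <item><title>Post A</title><link>http://example.com/a</link></item>
  <item><title>Post B</title><link>http://example.com/b</link></item>
</channel></rss>"""

def feed_items(xml_text):
    """Return (title, link) pairs for each item in an RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

Since the feed is "merely text strings," a class could pool many such saved searches and merge their items, which is the social-bookmarking-plus-social-search collaboration the article imagines.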
  • Students can search the blogosphere for political commentary, current cultural items, public developments in science, business news, and so on.
  • The ability to save and share a search, and in the case of PubSub, to literally search the future, lets students and faculty follow a search over time, perhaps across a span of weeks in a semester. As the live content changes, tools like Waypath’s topic stream, BlogPulse’s trend visualizations, or DayPop’s word generator let a student analyze how a story, topic, idea, or discussion changes over time. Furthermore, the social nature of these tools means that collaboration between classes, departments, campuses, or regions is easily supported. One could imagine faculty and students across the United States following, for example, the career of an Islamic feminist or the outcome of a genomic patent and discussing the issue through these and other Web 2.0 tools. Such a collaboration could, in turn, be discovered, followed, and perhaps joined by students and faculty around the world. Extending the image, one can imagine such a social research object becoming a learning object or an alternative to courseware.
  • A glance at Blogdex offers a rough snapshot of what the blogosphere is tending to pay attention to.
  • A closer look at an individual Blogdex result reveals the blogs that link to a story. As we saw with del.icio.us, this publication of interest allows the user to follow up on commentary, to see why those links are there, and to learn about those doing the linking. Once again, this is a service that connects people through shared interest in information.
  • The rich search possibilities opened up by these tools can further enhance the pedagogy of current events. A political science class could explore different views of a news story through traditional media using Google News, then from the world of blogs via Memeorandum. A history class could use Blogdex in an exercise in thinking about worldviews. There are also possibilities for a campus information environment. What would a student newspaper look like, for example, with a section based on the Digg approach or the OhmyNews structure? Thematizing these tools as objects for academic scrutiny, the operation and success of such projects is worthy of study in numerous disciplines, from communication to media studies, sociology to computer science.
  • At the same time, many services are hosted externally to academia. They are the creations of enthusiasts or business enterprises and do not necessarily embrace the culture of higher education.
  • Lawrence Lessig, J. D. Lasica, and others remind us that as tools get easier to use and practices become more widespread, it also becomes easier for average citizens to commit copyright violations.19
    • Barbara Lindsey
       
      Which is why he led the Creative Commons Movement and why he exhorts us to re-imagine copyright.
  • Web 2.0’s lowered barrier to entry may influence a variety of cultural forms with powerful implications for education, from storytelling to classroom teaching to individual learning. It is much simpler to set up a del.icio.us tag for a topic one wants to pursue or to spin off a blog or blog departmental topic than it is to physically meet co-learners and experts in a classroom or even to track down a professor. Starting a wiki-level text entry is far easier than beginning an article or book.
  • How can higher education respond, when it offers a complex, contradictory mix of openness and restriction, public engagement and cloistering?
Barbara Lindsey

Web 2.0: What does it constitute? | 11 Feb 2008 | ComputerWeekly.com - 0 views

  • O'Reilly identified Google as "the standard bearer for Web 2.0", and pointed out the differences between it and predecessors such as Netscape, which tried to adapt for the web the business model established by Microsoft and other PC software suppliers.
  • Google "began its life as a native web application, never sold or packaged, but delivered as a service, with customers paying, directly or indirectly.
  • perpetual beta, as O'Reilly later dubbed it
  • Perhaps the most important breakthrough was Google's willingness to relinquish control of the user-end of the transaction, instead of trying to lock them in with proprietary technology and restrictive licensing
  • O'Reilly took a second Web 2.0 principle from Peer-to-Peer pioneer BitTorrent, which works by completely decentralising the delivery of files, with every client also functioning as a server. The more popular a file is, the faster it can be served, since there are more users providing bandwidth and fragments of the file. Thus, "the service automatically gets better the more people use it".
  • Taking another model from open source, users are treated as "co-developers", actively encouraged to contribute, and monitored in real time to see what they are using, and how they are using it.
  • "Until Web 2.0 the learning curve to creating websites was quite high, complex, and a definite barrier to entry," says the third of our triumvirate of Tims, Tim Bray, director of Web Technologies at Sun Microsystems.
  • Web 2.0 takes some of its philosophical underpinning from James Surowiecki's book The Wisdom of Crowds, which asserts that the aggregated insights of large groups of diverse people can provide better answers and innovations than individual experts.
  • In practice, even fewer than 1% of people may be making a useful contribution - but these may be the most energetic and able members of a very large community. In 2006, 1,000 people, just 0.003% of its users, contributed around two-thirds of Wikipedia's edits.
  • Ajax speeds up response times by enabling just part of a page to be updated, instead of downloading a whole new page. Nielsen's objections include that this breaks the "back" button - the ability to get back to where you've been, which Nielsen says is the second most used feature in Web navigation.
  • "Everybody who has a Web browser has got that platform," says Berners-Lee, in a podcast available on IBM's developerWorks site. "So the nice thing about it is when you do code up an Ajax implementation, other people can take it and play with it."
  • Web 2.0 is a step on the way to the Semantic Web, a long-standing W3C initiative to create a standards-based framework able to understand the links between data which is related in the real world, and follow that data wherever it resides, regardless of application and database boundaries.
  • The problem with Web 2.0, Pemberton says, is that it "partitions the web into a number of topical sub-webs, and locks you in, thereby reducing the value of the network as a whole."
  • How do you decide which social networking site to join? he asks. "Do you join several and repeat the work?" With the Semantic Web's Resource Description Framework (RDF), you won't need to sign up to separate networks, and can keep ownership of your data. "You could describe it as a CSS for meaning: it allows you to add a small layer of markup to your page that adds machine-readable semantics."
  • The problems with Web 2.0 lock-in which Pemberton describes, were illustrated when a prominent member of the active 1%, Robert Scoble, ran a routine called Plaxo to try to extract details of his 5,000 contacts from Facebook, in breach of the site's terms of use, and had his account disabled. Although he has apparently had his account reinstated, the furore has made the issue of Web 2.0 data ownership and portability fiercely topical.
  • when Google announced its OpenSocial set of APIs, which will enable developers to create portable applications and bridges between social networking websites, Facebook was not among those taking part. Four years after O'Reilly attempted to define Web 2.0, Google, it seems, remains the standard-bearer, while others are forgetting what it was supposed to be about.
Barbara Lindsey

Steve Hargadon: Web 2.0 Is the Future of Education - 0 views

  • The new Web, or Web 2.0, is a two-way medium, based on contribution, creation, and collaboration--often requiring only access to the Web and a browser.
  • when people ask me the answer to content overload, I tell them (counter-intuitively) that it is to produce more content. Because it is in the act of our becoming a creator that our relationship with content changes, and we become more engaged and more capable at the same time.
  • Imagine an electronic book that allows you to comment on a sentence, paragraph, or section of the book, and see the comments from other readers... to then actually be in an electronic dialog with those other readers. It's coming.
  • There is no question that historical eras favor certain personalities and types, and the age of the collaborator is here or coming, depending on where you sit. The era of trusted authority (Time magazine, for instance, when I was young) is giving way to an era of transparent and collaborative scholarship (Wikipedia). The expert is giving way to the collaborator, since 1 + 1 truly equals 3 in this realm.
  • The combination of 1) an increased ability to work on specialized topics by gathering teams from around the globe, and 2) the diversity of those collaborators, should bring with it an incredible amount of innovation.
  • That anyone, anywhere in the world, can study the material from over 1,800 open courses at MIT is astounding, and it's only the start.
  • I believe that the read/write Web, or what we are calling Web 2.0, will culturally, socially, intellectually, and politically have a greater impact than the advent of the printing press
  • a study that showed that one of the strongest determinants of success in higher education is the ability to form or participate in study groups. In the video of his lecture he makes the point that study groups using electronic methods have almost the exact same results as physical study groups. The conclusion is somewhat stunning--electronic collaborative study technologies = success? Maybe not that simple, but the real-life conclusions here may dramatically alter how we view the structure of our educational institutions. JSB says that we move from thinking of knowledge as a "substance" that we transfer from teacher to student, to a social view of learning. Not "I think, therefore I am," but "We participate, therefore we are." From "access to information" to "access to people" (I find this stunning). From "learning about" to "learning to be." His discussions of the "apprenticeship" model of learning and how it's naturally being manifested on the front lines of the Internet (Open Source Software) are not to be missed.
  • "differentiated instruction" a reality that both parents and students will demand.
  • sites that combined several Web 2.0 tools together created the phenomenon of "social networking."
  • From consuming to producing * From authority to transparency * From the expert to the facilitator * From the lecture to the hallway * From "access to information" to "access to people" * From "learning about" to "learning to be" * From passive to passionate learning * From presentation to participation * From publication to conversation * From formal schooling to lifelong learning * From supply-push to demand-pull
  • The Answer to Information Overload Is to Produce More Information.
  • Participate
  • Learn About Web 2.0.
  • Lurk.
  • Teach Content Production.
  • Make Education a Public Discussion.
  • Help Build the New Playbook.
  • We've spent the last ten years teaching students how to protect themselves from inappropriate content – now we have to teach them to create appropriate content. They may be "digital natives," but their knowledge is surface level, and they desperately need training in real thinking skills. More than any other generation, they live lives that are largely separated from the adults around them, talking and texting on cell phones, and connecting online.
  • Those of you with suggestions of other resources, please post comments linking to them
Barbara Lindsey

Dr. Mashup; or, Why Educators Should Learn to Stop Worrying and Love the Remix | EDUCAU... - 0 views

  • A classroom portal that presents automatically updated syndicated resources from the campus library, news sources, student events, weblogs, and podcasts and that was built quickly using free tools.
  • Increasingly, it's not just works of art that are appropriated and remixed but the functionalities of online applications as well.
  • mashups involve the reuse, or remixing, of works of art, of content, and/or of data for purposes that usually were not intended or even imagined by the original creators.
  • ...31 more annotations...
  • What, exactly, constitutes a valid, original work? What are the implications for how we assess and reward creativity? Can a college or university tap the same sources of innovative talent and energy as Google or Flickr? What are the risks of permitting or opening up to this activity?
    • Barbara Lindsey
       
      Good discussion point
  • Remix is the reworking or adaptation of an existing work. The remix may be subtle, or it may completely redefine how the work comes across. It may add elements from other works, but generally efforts are focused on creating an alternate version of the original. A mashup, on the other hand, involves the combination of two or more works that may be very different from one another. In this article, I will apply these terms both to content remixes and mashups, which originated as a music form but now could describe the mixing of any number of digital media sources, and to data mashups, which combine the data and functionalities of two or more Web applications.
  • In his Harper's article "The Ecstasy of Influence," the novelist Jonathan Lethem imaginatively reviews the history of appropriation and recasts it as essential to the act of creation.3
  • Lethem's article is a must-read for anyone with an interest in the history of ideas, creativity, and intellectual property. It brilliantly synthesizes multiple disciplines and perspectives into a wonderfully readable and compelling argument. It is also, as the subtitle of his article acknowledges, "a plagiarism." Virtually every passage is a direct lift from another source, as the author explains in his "Key," which gives the source for every line he "stole, warped, and cobbled together." (He also revised "nearly every sentence" at least slightly.) Lethem's ideas noted in the paragraph above were appropriated from Siva Vaidhyanathan, Craig Baldwin, Richard Posner, and George L. Dillon.
  • Reading Walter Benjamin's highly influential 1936 essay "The Work of Art in the Age of Mechanical Reproduction,"4 it's clear that the profound effects of reproductive technology were obvious at that time. As Gould argued in 1964 (influenced by theorists such as Marshall McLuhan5), changes in how art is produced, distributed, and consumed in the electronic age have deep effects on the character of the art itself.
  • Yet the technology developments of the past century have clearly corresponded with a new attitude toward the "aura" associated with a work of invention and with more aggressive attitudes toward appropriation. It's no mere coincidence that the rise of modernist genres using collage techniques and more fragmented structures accompanied the emergence of photography and audio recording.
  • Educational technologists may wonder if "remix" or "content mashup" are just hipper-sounding versions of the learning objects vision that has absorbed so much energy from so many talented people—with mostly disappointing results.
  • The question is, why should a culture of remix take hold when the learning object economy never did?
  • when most learning object repositories were floundering, resource-sharing services such as del.icio.us and Flickr were enjoying phenomenal growth, with their user communities eagerly contributing heaps of useful metadata via simple folksonomy-oriented tagging systems.
  • the standards/practices relationship implicit in the learning objects model has been reversed. With only the noblest of intentions, proponents of learning objects (and I was one of them) went at the problem of promoting reuse by establishing an arduous and complex set of interoperability standards and then working to persuade others to adopt those standards. Educators were asked to take on complex and ill-defined tasks in exchange for an uncertain payoff. Not surprisingly, almost all of them passed.
  • Discoverable Resources
  • Educators might justifiably argue that their materials are more authoritative, reliable, and instructionally sound than those found on the wider Web, but those materials are effectively rendered invisible and inaccessible if they are locked inside course management systems.
  • It's a dirty but open secret that many courses in private environments use copyrighted third-party materials in a way that pushes the limits of fair use—third-party IP is a big reason why many courses cannot easily be made open.
  • The potential payoff for using open and discoverable resources, open and transparent licensing, and open and remixable formats is huge: more reuse means that more dynamic content is being produced more economically, even if the reuse happens only within an organization. And when remixing happens in a social context on the open web, people learn from each other's process.
  • Part of making a resource reusable involves making the right choices for file formats.
  • To facilitate the remixing of materials, educators may want to consider making the source files that were used to create a piece of multimedia available along with the finished result.
  • In addition to choosing the right file format and perhaps offering the original sources, another issue to consider when publishing content online is the critical question: "Is there an RSS feed available?" If so, conversion tools such as Feed2JS (http://www.feed2JS.org) allow for the republication of RSS-ified content in any HTML Web environment, including a course management system, simply by copying and pasting a few lines of JavaScript code. When an original source syndicated with RSS is updated, that update is automatically rendered anywhere it has been republished.
  • Jack Schofield
  • Guardian Unlimited
  • "An API provides an interface and a set of rules that make it much easier to extract data from a website. It's a bit like a record company releasing the vocals, guitars and drums as separate tracks, so you would not have to use digital processing to extract the parts you wanted."1
  • What's new about mashed-up application development? In a sense, the factors that have promoted this approach are the same ones that have changed so much else about Web culture in recent years. Essential hardware and software has gotten more powerful and for the most part cheaper, while access to high-speed connectivity and the enhanced quality of online applications like Google Docs have improved to the point that Tim O'Reilly and others can talk of "the emergent Internet operating system."15 The growth of user-centered technologies such as blogs have fostered a DIY ("do it yourself") culture that increasingly sees online interaction as something that can be personalized and adapted on the individual level. As described earlier, light syndication and service models such as RSS have made it easier and faster than ever to create simple integrations of diverse media types. David Berlind, executive editor of ZDNet, explains: "With mashups, fewer technical skills are needed to become a developer than ever. Not only that, the simplest ones can be done in 10 or 15 minutes. Before, you had to be a pretty decent code jockey with languages like C++ or Visual Basic to turn your creativity into innovation. With mashups, much the same way blogging systems put Web publishing into the hands of millions of ordinary non-technical people, the barrier to developing applications and turning creativity into innovation is so low that there's a vacuum into which an entire new class of developers will be sucked."16
  • The ability to "clone" other users' mashups is especially exciting: a newcomer does not need to spend time learning how to structure the data flows but can simply copy an existing framework that looks useful and then make minor modifications to customize the result.19
    • Barbara Lindsey
       
      This is the idea behind the MIT repository--remixing content to suit local needs.
  • As with content remixing, open access to materials is not just a matter of some charitable impulse to share knowledge with the world; it is a core requirement for participating in some of the most exciting and innovative activity on the Web.
  • "My Maps" functionality
  • For those still wondering what the value proposition is for offering an open API, Google's development process offers a compelling example of the potential rewards.
    • Barbara Lindsey
       
      Wikinomics
  • Elsewhere, it is difficult to point to significant activity suggesting that the mashup ethos is taking hold in academia the way it is on the wider Web.
  • Yet for the most part, the notion of the data mashup and the required openness is not even a consideration in discussions of technology strategy in higher educational institutions. "Data integration" across campus systems is something that is handled by highly skilled professionals at highly skilled prices.
  • Revealing how a more adventurous and inclusive online development strategy might look on campus, Raymond Yee recently posted a comprehensive proposal for his university (UC Berkeley), in which he outlined a "technology platform" not unlike the one employed by Amazon.com (http://aws.amazon.com/)—resources and access that would be invaluable for the institution's programmers as well as for outside interests to build complementary services.
  • All too often, college and university administrators react to this type of innovation with suspicion and outright hostility rather than cooperation.
  • those of us in higher education who observe the successful practices in the wider Web world have an obligation to consider and discuss how we might apply these lessons in our own contexts. We might ask if the content we presently lock down could be made public with a license specifying reasonable terms for reuse. When choosing a content management system, we might consider how well it supports RSS syndication. In an excellent article in the March/April 2007 issue of EDUCAUSE Review, Joanne Berg, Lori Berquam, and Kathy Christoph listed a number of campus activities that could benefit from engaging social networking technologies.26
  • What might happen if we allow our campus innovators to integrate their practices in these areas in the same way that social networking application developers are already integrating theirs? What is the mission-critical data we cannot expose, and what can we expose with minimal risk? And if the notion of making data public seems too radical a step, can APIs be exposed to selected audiences, such as on-campus developers or consortia partners?
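The RSS republishing workflow mentioned in the excerpts above (a Feed2JS-style conversion of a syndicated feed into embeddable HTML) is simple enough to sketch. Below is a minimal illustration using only Python's standard library; the sample feed, titles, and URLs are invented for the example and are not from any real source:

```python
import xml.etree.ElementTree as ET

# An invented RSS 2.0 feed, standing in for a real syndicated source
# such as a campus library news feed.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Campus Library News</title>
    <item><title>New hours</title><link>http://example.edu/hours</link></item>
    <item><title>Citation workshop</title><link>http://example.edu/workshop</link></item>
  </channel>
</rss>"""

def rss_to_html(rss_text):
    """Render an RSS channel's items as an HTML list, ready to embed in any page."""
    channel = ET.fromstring(rss_text).find("channel")
    items = [
        '<li><a href="{}">{}</a></li>'.format(i.findtext("link"), i.findtext("title"))
        for i in channel.findall("item")
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(rss_to_html(SAMPLE_RSS))
```

Because the conversion runs again each time the embedding page is rendered, an update to the source feed propagates automatically to every page that republishes it, which is the behavior the article attributes to Feed2JS.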
Celeste Arrieta

The Web IS the Platform | Stick in the Sand - 1 views

  • The idea that students are digital natives is a myth
  • they need to be taught to see the web as a learning tool
  • helping students and teachers make sense out of an ever-growing, ever-diversifying web.
  • how to sort web tools by function (research, production, publication, discussion and management) to create a simple, solid framework for helping students and teachers make sense out of an ever-growing, ever-diversifying web.
  • ...1 more comment...
  • This is a very good resource. Thanks, Barbara!
  • Glad you find our Diigo group useful, Celeste! Hope all is well with you.
Barbara Lindsey

Learning Spaces | EDUCAUSE - 0 views

  • Net Gen students are facile at multitasking
    • Barbara Lindsey
       
      The research shows that no one can multitask effectively... See John Medina and Brain Rules, for example.
  • Workers anticipated having a single profession for the duration of their working lives. Education was based on a factory-like, "one size fits all" model. Talent was developed by weeding out those who could not do well in a monochromatic learning environment.
    • Barbara Lindsey
       
      Also part and parcel of hegemonic educational practices which served to reinforce the existing social and economic paradigm.
  • Knowing now means using a well-organized set of facts to find new information and to solve novel problems. In 1900, learning consisted largely of memorization; today it relies chiefly on understanding.
  • ...25 more annotations...
  • learners construct knowledge by understanding new information building on their current understanding and expertise. Constructivism contradicts the idea that learning is the transmission of content to a passive receiver. Instead, it views learning as an active process, always based on the learner's current understanding or intellectual paradigm. Knowledge is constructed by assimilating new information into the learner's knowledge paradigm. A learner does not come to a classroom or a course Web site with a mind that is a tabula rasa, a blank slate. Each learner arrives at a learning "site" with some preexisting level of understanding.
  • Learning science research also highlights the importance of learner engagement, or as the American Psychological Association describes it, intentional learning.1 This means that learners must have a "metaperspective" from which to view and assess their own learning, which is often referred to as metacognition.2 An active learning environment provides the opportunity to assess one's own learning, enabling learners to make decisions about the course, as well as reflect on and assess their progress. In the past, the measure of learning was the final grade (a summative measure). But a final grade is merely a measure of the student's performance on tests. It does not measure the learning that did—or did not—take place. To encourage learning, summative testing or assessments must be combined with formative assessments. Formative assessment is not directly associated with the final grade; it helps learners understand their learning and make decisions about next steps based on that understanding.
  • research indicating that learning is encouraged when it includes social components such as debate or direct engagement with peers and experts. Learning is strengthened through social interactions, interpersonal relations, and communication with others.
  • Research indicates that learners need to be active with respect to their own learning process and assessment. Net Gen students' goal and achievement orientation comes into play here: that achievement focus can be directed toward quizzes and exercises that assist learners in evaluating their progress toward learning goals.
  • Obviously not all forms of learning must be social or team-based. In a variety of learning contexts, individual work is important. It may well be that Net Gen students' strengths are also their weaknesses. The expectation for fast-paced, rapidly shifting interaction coupled with a relatively short attention span may be counterproductive in many learning contexts. Repetition and steady, patient practice—key to some forms of mastery—may prove difficult for Net Gen students. Designing courses for them necessitates balancing these strengths and weaknesses.
  • We should not neglect the informal for the formal, or assume that Net Gen students somehow will figure out the virtual space on their own. We should connect what happens in the classroom with what happens in informal and virtual spaces.
  • Simply installing wireless access points and fresh carpeting isn't enough if done in isolation; such improvements pay real dividends only if they are in concert with the institution's overall teaching and learning objectives. It is the vision that generates the design principles that will, in turn, be used to make key decisions about how learning spaces are configured.
  • The vision and design principles should emphasize the options students have as active participants in the learning process. Design principles should include terms such as analyze, create, criticize, debate, present, and classify—all directed at what the space enables the students to do. For example, students should be able to present materials to the class. Outside class, they should have access to applications and materials that directly support analysis of data, text, and other media. Forums for discussion and critical debate, both real and virtual, are key to encouraging learning and will be looked for by Net Gen students.
  • Learning spaces should accommodate the use of as many kinds of materials as possible and enable the display of and access to those materials by all participants. Learning space needs to provide the participants—instructors and students alike—with interactive tools that enable exploration, probing, and examination. This might include a robust set of applications installed on the computer that controls the room's displays, as well as a set of communication tools. Since the process of examination and debate leads to discovery and the construction of new knowledge, it could be important to equip spaces with devices that can capture classroom discussion and debate, which can be distributed to all participants for future reference and study.
  • the end of the class meeting marks a transition from one learning mode to another.
  • This lecture hall is of relatively recent vintage; its seats and paired tables make it much easier to deploy and use her "tools," which include printouts of the day's reading, as well as a small laptop computer. Her fellow students are doing likewise. Each of them is using some device to access the course's Web site—some with laptops, others with tablet computers, still others with handheld computers. Using wireless connections, they all access the course's Web site and navigate to the site's "voting" page.
  • a "magic wand," a radio-frequency controller that enables her to operate her computer—as well as many of the classroom's functions—wirelessly, from any point in the room. She can capture anything she writes on the blackboard and make it available to her students on the course Web site. Freed from needing to take extensive notes, the students are able to participate more fully in the class discussion. Finally, the professor is carrying a small recorder that captures her lecture, digitizes the audio, and uploads it to the course Web site for the students to review when they prepare for finals.
  • Sandra launches the classroom's screen sharing application. Within a few seconds, her computer's screen is projected on the room's main screen. The class discussion focuses on this diagram, and the professor, using a virtual pencil, is able to make notes on the diagram. The diagram and notes are captured and placed on the class Web site for review.
  • Soon the debate gets stuck; the students can't resolve the issue. The professor goes to the podium, types briefly, and then asks the students to go to a URL to see a question and to choose the answer they feel is correct. The students access the Web page from laptops, handhelds, or wireless IP-based phones. In two minutes they have completed the poll and submitted their responses. The results are quickly tabulated and displayed. The wide diversity of opinion surprises everyone. The professor reframes the issue, without giving the answer, and the students continue to discuss it. She repeats the poll; this time there is more agreement among the students, enabling her to move the discussion forward.
    • Barbara Lindsey
       
      Could you see being able to do this? Would this work for you?
  • She goes to the podium computer and clicks on a few links, and soon a videoconferencing session is displayed on the right-hand screen. She has arranged to have a colleague of hers "drop in" on the class to discuss a point that is in the colleague's particular area of expertise. The class has a conversation with the expert, who is at a large research institution more than 500 miles away. Students listen to the expert's comments and are able to pose questions using one of the three cordless microphones available to the class. On the left-hand screen, the visiting professor shows some images and charts that help explain the concepts under discussion.
  • the other students in her class have signed up for most of the slots, conferring with friends using chat programs to ensure that they sign up for the same lab slots.
  • The discussion pocket is the college's term for a small, curved space with a table and bench to accommodate a meeting of four or five people. Found outside the newer classrooms, they are handy for informal, spontaneous discussions. Sandra's group moves into the pocket and for the next 15 minutes continue their "spill over" discussion of the class.
    • Barbara Lindsey
       
      How does this change perceptions of when and where learning begins and ends?
  • They are able to have an audio chat; Sandra's friend is in her dorm room, and Sandra is in a remote corner of the library where conversation will not disturb others. As their discussion progresses, they go to the course's Web site and launch the virtual whiteboard to diagram some concepts. They develop a conceptual diagram—drawing, erasing, and revising it until they agree the diagram is correct. They both download a copy. Sandra volunteers to work on polishing the diagram and will leave a copy of the final diagram in her share folder in her online portfolio "locker."
  • The underlying theme remains the same, however: cultivating learning practices consistent with learning theory and aligned with the habits and expectations of Net Gen students
  • For most higher education institutions, the lecture hall will not disappear; the challenge is to develop a new generation of lecture hall, one that enables Net Gen students and faculty to engage in enlivened, more interactive experiences. If the lecture hall is integrated with other spaces—physically as well as virtually—it will enable participants to sustain the momentum from the class session into other learning contexts. The goal is not to do away with the traditional classroom, but rather to reinvent and to integrate it with the other learning spaces, moving toward a single learning environment.
  • Learning theory is central to any consideration of learning spaces; colleges and universities cannot afford to invest in "fads" tailored to the Net Gen student that might not meet the needs of the next generation.
  • For example, start with the Net Gen students' focus on goals and achievement. That achievement orientation ties to learning theory's emphasis on metacognition, where learners assess their progress and make active decisions to achieve learning goals. Learning space design could support this by providing contact with people who can provide feedback: tutors, consultants, and faculty. This could, in turn, be supported in the IT environment by making formative self-tests available, as well as an online portfolio, which would afford students the opportunity to assess their overall academic progress.
  • As institutions create an anywhere, anytime IT infrastructure, opportunities arise to tear down silos and replace them with a more ubiquitous learning environment. Using laptops and other networked devices, students and faculty are increasingly able to carry their entire working environment with them. To capitalize on this, campus organizations must work collaboratively to create a more integrated work environment for the students and faculty, one that better serves the mobile Net Gen students as well as a faculty faced with the initial influx of these students into their ranks
  • One of the key variables is the institution itself. Learning spaces are institutional in scope—their implementation involves the institution's culture, tradition, and mission.
  • The starting point for rethinking learning spaces to support Net Gen students begins with an underlying vision for the learning activities these spaces should support. This vision should be informed by learning theory, as well as by recognition of the characteristics of the students and faculty who use these spaces.
Barbara Lindsey

The New Gold Mine: Your Personal Information & Tracking Data Online - WSJ.com - 0 views

  • the tracking of consumers has grown both far more pervasive and far more intrusive than is realized by all but a handful of people in the vanguard of the industry.
  • The study found that the nation's 50 top websites on average installed 64 pieces of tracking technology onto the computers of visitors, usually with no warning. A dozen sites each installed more than a hundred. The nonprofit Wikipedia installed none.
  • the Journal found new tools that scan in real time what people are doing on a Web page, then instantly assess location, income, shopping interests and even medical conditions. Some tools surreptitiously re-spawn themselves even after users try to delete them.
  • These profiles of individuals, constantly refreshed, are bought and sold on stock-market-like exchanges that have sprung up in the past 18 months.
  • Advertisers once primarily bought ads on specific Web pages—a car ad on a car site. Now, advertisers are paying a premium to follow people around the Internet, wherever they go, with highly specific marketing messages.
  • ...22 more annotations...
  • "It is a sea change in the way the industry works," says Omar Tawakol, CEO of BlueKai. "Advertisers want to buy access to people, not Web pages."
  • The Journal found that Microsoft Corp.'s popular Web portal, MSN.com, planted a tracking file packed with data: It had a prediction of a surfer's age, ZIP Code and gender, plus a code containing estimates of income, marital status, presence of children and home ownership, according to the tracking company that created the file, Targus Information Corp.
  • Tracking is done by tiny files and programs known as "cookies," "Flash cookies" and "beacons." They are placed on a computer when a user visits a website. U.S. courts have ruled that it is legal to deploy the simplest type, cookies, just as someone using a telephone might allow a friend to listen in on a conversation. Courts haven't ruled on the more complex trackers.
  • tracking companies sometimes hide their files within free software offered to websites, or hide them within other tracking files or ads. When this happens, websites aren't always aware that they're installing the files on visitors' computers.
  • Often staffed by "quants," or math gurus with expertise in quantitative analysis, some tracking companies use probability algorithms to try to pair what they know about a person's online behavior with data from offline sources about household income, geography and education, among other things. The goal is to make sophisticated assumptions in real time—plans for a summer vacation, the likelihood of repaying a loan—and sell those conclusions.
  • Consumer tracking is the foundation of an online advertising economy that racked up $23 billion in ad spending last year. Tracking activity is exploding. Researchers at AT&T Labs and Worcester Polytechnic Institute last fall found tracking technology on 80% of 1,000 popular sites, up from 40% of those sites in 2005.
  • The Journal found tracking files that collect sensitive health and financial data. On Encyclopaedia Britannica Inc.'s dictionary website Merriam-Webster.com, one tracking file from Healthline Networks Inc., an ad network, scans the page a user is viewing and targets ads related to what it sees there.
    • Barbara Lindsey
       
      Tracking you and targeting ads to you on a popular dictionary site!
  • Beacons, also known as "Web bugs" and "pixels," are small pieces of software that run on a Web page. They can track what a user is doing on the page, including what is being typed or where the mouse is moving.
  • The majority of sites examined by the Journal placed at least seven beacons from outside companies. Dictionary.com had the most, 41, including several from companies that track health conditions and one that says it can target consumers by dozens of factors, including zip code and race.
  • After the Journal contacted the company, it cut the number of networks it uses and beefed up its privacy policy to more fully disclose its practices.
  • Flash cookies can also be used by data collectors to re-install regular cookies that a user has deleted. This can circumvent a user's attempt to avoid being tracked online. Adobe condemns the practice.
  • Most sites examined by the Journal installed no Flash cookies. Comcast.net installed 55.
  • Wittingly or not, people pay a price in reduced privacy for the information and services they receive online. Dictionary.com, the site with the most tracking files, is a case study.
  • Think about how these technologies and the associated analytics can be used in other industries and social settings (e.g. education) for real beneficial impacts. This is nothing new for the web; now that it has matured, it can be a positive game-changer.
  • Media6Degrees Inc., whose technology was found on three sites by the Journal, is pitching banks to use its data to size up consumers based on their social connections. The idea is that the creditworthy tend to hang out with the creditworthy, and deadbeats with deadbeats.
  • "There are applications of this technology that can be very powerful," says Tom Phillips, CEO of Media6Degrees. "Who knows how far we'd take it?"
  • Hidden inside Ashley Hayes-Beaty's computer, a tiny file helps gather personal details about her, all to be put up for sale for a tenth of a penny.
  • "We can segment it all the way down to one person," says Eric Porres, Lotame's chief marketing officer.
  • One of the fastest-growing businesses on the Internet, a Wall Street Journal investigation has found, is the business of spying on Internet users.
  • Yahoo Inc.'s ad network,
  • "Every time I go on the Internet," she says, she sees weight-loss ads. "I'm self-conscious about my weight," says Ms. Reid, whose father asked that her hometown not be given. "I try not to think about it…. Then [the ads] make me start thinking about it."
  • Information about people's moment-to-moment thoughts and actions, as revealed by their online activity, can change hands quickly. Within seconds of visiting eBay.com or Expedia.com, information detailing a Web surfer's activity there is likely to be auctioned on the data exchange run by BlueKai, the Seattle startup.
  •   a New York company that uses sophisticated software called a "beacon" to capture what people are typing on a website
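The tracking mechanics described in the highlights above boil down to handing each new browser a persistent unique ID. A minimal sketch in Python's standard library (the cookie name `uid` and the one-year lifetime are invented for illustration; real beacons and Flash-cookie respawning are far more elaborate):

```python
# A sketch of the persistent-ID cookie a tracker assigns each new visitor.
# The name "uid" and one-year lifetime are hypothetical choices.
import uuid
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["uid"] = uuid.uuid4().hex               # unique ID for this browser
cookie["uid"]["max-age"] = 60 * 60 * 24 * 365  # persist for a year
cookie["uid"]["path"] = "/"                    # sent with every request to the site

# The HTTP response header the tracker's server would emit:
header = cookie.output(header="Set-Cookie:")
```

Every later request from that browser carries the ID back, which is what lets a data exchange "segment it all the way down to one person."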
Barbara Lindsey

Web 2.0: beyond the buzz words | 4 Jun 2007 | ComputerWeekly.com - 0 views

  • Lee Bryant, one of the founders of Headshift, says the network effect is the difference. Traditional applications, such as groupware, became slower the more people used them, he says. With Web 2.0 applications the reverse is true: the more people use them, the more effective they become.
  • “You influence each other, so that if you use a social tagging system, for example, themes start to emerge and other people pick up on them and you get these positive feedback loops. It is that difference that leads to the network effect.”
  • These technologies are mostly just HTML and Javascript web pages designed to offer a more streamlined user experience, sitting atop a relational data layer used to feed back user-contributed data in new ways.
  • “We suddenly have enough bandwidth, memory and computing power around these net-centric platforms,” he says. This means that the “people-to-people” concept that Web 1.0 wanted to accomplish can be supported, but with software interfaces that make it easier to contribute.
  • Seely Brown’s project-by-project approach is well-advised. “Start by putting together a decent collection of RSS feeds relevant to your project,” says Bryant. Then, enabling the posting and sharing of bookmarks will help glean knowledge from the project team. Complementing this with blogs will enable people to spend more time on those elements from the bookmarks and feeds that are particularly relevant and need further articulation.
  • Understanding the difference between consuming newsfeeds and consuming e-mail demonstrates a wider cultural shift that needs to take place in Web 2.0-savvy organisations. Generally, e-mails demand focused attention. They are processed in sequence and each takes a couple of minutes (or more) from your day. Handling newsfeeds and blog posts in that way would make you unproductive, says Bryant. They require a "river of news" approach, in which workers skim large amounts of information for helpful nuggets. Social tagging helps to naturally elevate certain topics above others by making them more popular. Finally, a wiki will help escalate blog discussion to more collaborative working, as needed. This has certainly been Ward's experience: "The way the sites tend to work is that the blog is where people have a dialogue, but if it moves into more detailed work, it moves into the wiki," she says.
Barbara Lindsey

Print: The Chronicle: 6/15/2007: The New Metrics of Scholarly Authority - 0 views

    • Barbara Lindsey
       
      Higher ed slow to respond.
  • Web 2.0 is all about responding to abundance, which is a shift of profound significance.
  • Chefs simply couldn't exist in a world of universal scarcity
  • a time when scholarship, and how we make it available, will be affected by information abundance just as powerfully as food preparation has been.
  • Scholarly communication before the Internet required the intermediation of publishers. The costliness of publishing became an invisible constraint that drove nearly all of our decisions. It became the scholar's job to be a selector and interpreter of difficult-to-find primary and secondary sources; it was the scholarly publisher's job to identify the best scholars with the best perspective and the best access to scarce resources.
    • Barbara Lindsey
       
      Comments?
  • Online scholarly publishing in Web 1.0 mimicked those fundamental conceptions. The presumption was that information scarcity still ruled. Most content was closed to nonsubscribers; exceedingly high subscription costs for specialty journals were retained; libraries continued to be the primary market; and the "authoritative" version was untouched by comments from the uninitiated. Authority was measured in the same way it was in the scarcity world of paper: by number of citations to or quotations from a book or article, the quality of journals in which an article was published, the institutional affiliation of the author, etc.
  • Google
    • Barbara Lindsey
       
      Where critical analysis comes in
  • The challenge for all those sites pertains to abundance:
  • Such systems have not been framed to confer authority, but as they devise means to deal with predators, scum, and weirdos wanting to be a "friend," they are likely to expand into "trust," or "value," or "vouching for my friend" metrics — something close to authority — in the coming years.
  • Recently some more "authoritative" editors have been given authority to override whining ax grinders.
  • In many respects Boing Boing is an old-school edited resource. It doesn't incorporate feedback or comments, but rather is a publication constructed by five editor-writers
  • As the online environment matures, most social spaces in many disciplines will have their own "boingboings."
  • They differ from current models mostly by their feasible computability in a digital environment where all elements can be weighted and measured, and where digital interconnections provide computable context.
  • In the very near future, if we're talking about a universe of hundreds of billions of documents, there will routinely be thousands, if not tens of thousands, if not hundreds of thousands, of documents that are very similar to any new document published on the Web. If you are writing a scholarly article about the trope of smallpox in Shakespearean drama, how do you ensure you'll be read? By competing in computability. Encourage your friends and colleagues to link to your online document. Encourage online back-and-forth with interested readers. Encourage free access to much or all of your scholarly work. Record and digitally archive all your scholarly activities. Recognize others' works via links, quotes, and other online tips of the hat. Take advantage of institutional repositories, as well as open-access publishers. The list could go on.
  • the new authority metrics, instead of relying on scholarly publishers to establish the importance of material for them.
  • They need to play a role in deciding not just what material will be made available online, but also how the public will be allowed to interact with the material. That requires a whole new mind-set.
  • Scholarly publishers
  • Many of the values of scholarship are not well served yet by the Web: contemplation, abstract synthesis, construction of argument.
  • Traditional models of authority will probably hold sway in the scholarly arena for 10 to 15 years, while we work out the ways in which scholarly engagement and significance can be measured in new kinds of participatory spaces.
  • if scholarly output is locked away behind fire walls, or on hard drives, or in print only, it risks becoming invisible to the automated Web crawlers, indexers, and authority-interpreters that are being developed. Scholarly invisibility is rarely the path to scholarly authority.
  • Web 1.0,
  • garbed new business and publishing models in 20th-century clothes.
  • fundamental presumption is one of endless information abundance.
  • Flickr, YouTube
  • micromarkets
  • multiple demographics
  • Abundance leads to immediate context and fact checking, which changes the "authority market" substantially. The ability to participate in most online experiences (via comments, votes, or ratings) is now presumed, and when it's not available, it's missed.
  • Google interprets a link from Page A to Page B as a vote, by Page A, for Page B. But, Google looks at more than the sheer volume of votes, or links a page receives; for example, it also analyzes the page that casts the vote. Votes cast by pages that are themselves 'important' weigh more heavily and help to make other pages 'important,'"
  • It has its limits, but it also both confers and confirms authority because people tend to point to authoritative sources to bolster their own work.
  • That kind of democratization of authority is nearly unique to wikis that are group edited, since not observation, but active participation in improvement, is the authority metric.
  • user-generated authority, many of which are based on algorithmic analysis of participatory engagement. The emphasis in such models is often not on finding scarce value, but on weeding abundance
  • Authority 3.0 will probably include (the list is long, which itself is a sign of how sophisticated our new authority makers will have to be): Prestige of the publisher (if any). Prestige of peer prereviewers (if any). Prestige of commenters and other participants. Percentage of a document quoted in other documents. Raw links to the document. Valued links, in which the values of the linker and all his or her other links are also considered. Obvious attention: discussions in blogspace, comments in posts, reclarification, and continued discussion. Nature of the language in comments: positive, negative, interconnective, expanded, clarified, reinterpreted. Quality of the context: What else is on the site that holds the document, and what's its authority status? Percentage of phrases that are valued by a disciplinary community. Quality of author's institutional affiliation(s). Significance of author's other work. Amount of author's participation in other valued projects, as commenter, editor, etc. Reference network: the significance rating of all the texts the author has touched, viewed, read. Length of time a document has existed. Inclusion of a document in lists of "best of," in syllabi, indexes, and other human-selected distillations. Types of tags assigned to it, the terms used, the authority of the taggers, the authority of the tagging system.
  • Most technophile thinkers out there believe that Web 3.0 will be driven by artificial intelligences — automated computer-assisted systems that can make reasonable decisions on their own, to preselect, precluster, and prepare material based on established metrics, while also attending very closely to the user's individual actions, desires, and historic interests, and adapting to them.
  •   When the system of scholarly communications was dependent on the physical movement of information goods, we did business in an era of information scarcity. As we become dependent on the digital movement of information goods, we find ourselves entering an era of information abundance. In the process, we are witnessing a radical shift in how we establish authority, significance, and even scholarly validity. That has major implications for, in particular, the humanities and social sciences.
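The linking-as-voting idea quoted in the highlights above ("votes cast by pages that are themselves 'important' weigh more heavily") can be sketched as a tiny power iteration. The four-page link graph below is hypothetical, invented for illustration; real PageRank runs over billions of pages:

```python
# Minimal PageRank sketch of the "links as weighted votes" idea.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
damping = 0.85                      # standard damping factor
rank = {page: 1 / len(links) for page in links}

for _ in range(50):                 # power iteration until ranks settle
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            # A link passes on a share of the *linker's* rank, so votes
            # from important pages weigh more.
            new_rank[target] += share
    rank = new_rank

top = max(rank, key=rank.get)       # "C" collects votes from A, B, and D
```

Note how "C" ends up most authoritative not just from raw link counts but because one of its voters ("A") is itself heavily linked, which is exactly the self-reinforcing authority loop the article describes.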
Barbara Lindsey

Web 2.0 ERC | Simplifying Web 2.0 Education - 0 views

  •  
    A European Union-funded project to help educators learn about Web 2.0 and its use for learning.
Barbara Lindsey

Web 2.0 wanted by kids but not teachers | 5 Sep 2008 | ComputerWeekly.com - 0 views

  • Parents that understand technology see the value of Web 2.0 in the classroom, but teachers are less certain according to research.
  • Security and a lack of understanding are the major obstacles for teachers accepting Web 2.0, said the report.
  • In contrast two thirds of parents questioned said Web 2.0 is a positive addition to the classroom. And children themselves are already using the technologies.
Barbara Lindsey

Shirky: A Group Is Its Own Worst Enemy - 1 views

  • April 24, 2003
  • I want to talk about a pattern I've seen over and over again in social software that supports large and long-lived groups.
  • definition of social software
  • It's software that supports group interaction
  • how radical that pattern is. The Internet supports lots of communications patterns, principally point-to-point and two-way, one-to-many outbound, and many-to-many two-way.
  • Prior to the Internet, the last technology that had any real effect on the way people sat down and talked together was the table.
  • We've had social software for 40 years at most, dated from the Plato BBS system, and we've only had 10 years or so of widespread availability, so we're just finding out what works. We're still learning how to make these kinds of things.
  • So email doesn't necessarily support social patterns, group patterns, although it can. Ditto a weblog. If I'm Glenn Reynolds, and I'm publishing something with Comments Off and reaching a million users a month, that's really broadcast.
  • If it's a cluster of half a dozen LiveJournal users, on the other hand, talking about their lives with one another, that's social. So, again, weblogs are not necessarily social, although they can support social patterns.
  • So there's this very complicated moment of a group coming together, where enough individuals, for whatever reason, sort of agree that something worthwhile is happening, and the decision they make at that moment is: This is good and must be protected. And at that moment, even if it's subconscious, you start getting group effects. And the effects that we've seen come up over and over and over again in online communities.
  • You are at a party, and you get bored. You say "This isn't doing it for me anymore. I'd rather be someplace else.
  • The party fails to meet some threshold of interest. And then a really remarkable thing happens: You don't leave.
  • That kind of social stickiness is what Bion is talking about.
  • Twenty minutes later, one person stands up and gets their coat, and what happens? Suddenly everyone is getting their coats on, all at the same time. Which means that everyone had decided that the party was not for them, and no one had done anything about it, until finally this triggering event let the air out of the group, and everyone kind of felt okay about leaving.
  • This effect is so steady it's sometimes called the paradox of groups.
  • what's less obvious is that there are no members without a group.
  • there are some very specific patterns that they're entering into to defeat the ostensible purpose of the group meeting together. And he detailed three patterns.
  • The first is sex talk,
  • second basic pattern
  • The identification and vilification of external enemies.
  • So even if someone isn't really your enemy, identifying them as an enemy can cause a pleasant sense of group cohesion. And groups often gravitate towards members who are the most paranoid and make them leaders, because those are the people who are best at identifying external enemies.
  • third pattern Bion identified: Religious veneration
  • The religious pattern is, essentially, we have nominated something that's beyond critique.
  • So these are human patterns that have shown up on the Internet, not because of the software, but because it's being used by humans. Bion has identified this possibility of groups sandbagging their sophisticated goals with these basic urges. And what he finally came to, in analyzing this tension, is that group structure is necessary. Robert's Rules of Order are necessary. Constitutions are necessary. Norms, rituals, laws, the whole list of ways that we say, out of the universe of possible behaviors, we're going to draw a relatively small circle around the acceptable ones.
  • He said the group structure is necessary to defend the group from itself. Group structure exists to keep a group on target, on track, on message, on charter, whatever. To keep a group focused on its own sophisticated goals and to keep a group from sliding into these basic patterns. Group structure defends the group from the action of its own members.
  • technical and social issues are deeply intertwined. There's no way to completely separate them.
  • Some of the users wanted the system to continue to exist and to provide a forum for discussion. And other of the users, the high school boys, either didn't care or were actively inimical. And the system provided no way for the former group to defend itself from the latter.
  • What matters is, a group designed this and then was unable, in the context they'd set up, partly a technical and partly a social context, to save it from this attack from within. And attack from within is what matters.
  • This pattern has happened over and over and over again. Someone built the system, they assumed certain user behaviors. The users came on and exhibited different behaviors. And the people running the system discovered to their horror that the technological and social issues could not in fact be decoupled.
  • And the worst crisis is the first crisis, because it's not just "We need to have some rules." It's also "We need to have some rules for making some rules." And this is what we see over and over again in large and long-lived social software systems. Constitutions are a necessary component of large, long-lived, heterogenous groups.
  • As a group commits to its existence as a group, and begins to think that the group is good or important, the chance that they will begin to call for additional structure, in order to defend themselves from themselves, gets very, very high.
  • The downside of going for size and scale above all else is that the dense, interconnected pattern that drives group conversation and collaboration isn't supportable at any large scale. Less is different -- small groups of people can engage in kinds of interaction that large groups can't. And so we blew past that interesting scale of small groups. Larger than a dozen, smaller than a few hundred, where people can actually have these conversational forms that can't be supported when you're talking about tens of thousands or millions of users, at least in a single group.
  • So the first answer to Why Now? is simply "Because it's time." I can't tell you why it took as long for weblogs to happen as it did, except to say it had absolutely nothing to do with technology. We had every bit of technology we needed to do weblogs the day Mosaic launched the first forms-capable browser. Every single piece of it was right there. Instead, we got Geocities. Why did we get Geocities and not weblogs? We didn't know what we were doing.
  • It took a long time to figure out that people talking to one another, instead of simply uploading badly-scanned photos of their cats, would be a useful pattern. We got the weblog pattern in around '96 with Drudge. We got weblog platforms starting in '98. The thing really was taking off in 2000. By last year, everyone realized: Omigod, this thing is going mainstream, and it's going to change everything.
  • Why was there an eight-year gap between a forms-capable browser and the Pepys diaries? I don't know. It just takes a while for people to get used to these ideas. So, first of all, this is a revolution in part because it is a revolution. We've internalized the ideas and people are now working with them. Second, the things that people are now building are web-native.
  • A weblog is web-native. It's the web all the way in. A wiki is a web-native way of hosting collaboration. It's lightweight, it's loosely coupled, it's easy to extend, it's easy to break down. And it's not just the surface, like oh, you can just do things in a form. It assumes http is transport. It assumes markup in the coding. RSS is a web-native way of doing syndication. So we're taking all of these tools and we're extending them in a way that lets us build new things really quickly.
  • Third, in David Weinberger's felicitous phrase, we can now start to have a Small Pieces Loosely Joined pattern.
  • You can say, in the conference call or the chat: "Go over to the wiki and look at this."
  • It's just three little pieces of software laid next to each other and held together with a little bit of social glue. This is an incredibly powerful pattern. It's different from: Let's take the Lotus juggernaut and add a web front-end.
  • And finally, and this is the thing that I think is the real freakout, is ubiquity.
  • In many situations, all people have access to the network. And "all" is a different kind of amount than "most." "All" lets you start taking things for granted.
  • But for some groups of people -- students, people in high-tech offices, knowledge workers -- everyone they work with is online. Everyone they're friends with is online. Everyone in their family is online.
  • And this pattern of ubiquity lets you start taking this for granted.
  • There's a second kind of ubiquity, which is the kind we're enjoying here thanks to Wifi. If you assume whenever a group of people are gathered together, that they can be both face to face and online at the same time, you can start to do different kinds of things. I now don't run a meeting without either having a chat room or a wiki up and running. Three weeks ago I ran a meeting for the Library of Congress. We had a wiki, set up by Socialtext, to capture a large and very dense amount of technical information on long-term digital preservation.
  • The people who organized the meeting had never used a wiki before, and now the Library of Congress is talking as if they always had a wiki for their meetings, and are assuming it's going to be at the next meeting as well -- the wiki went from novel to normal in a couple of days.
  • It really quickly becomes an assumption that a group can do things like "Oh, I took my PowerPoint slides, I showed them, and then I dumped them into the wiki. So now you can get at them." It becomes a sort of shared repository for group memory. This is new. These kinds of ubiquity, both everyone is online, and everyone who's in a room can be online together at the same time, can lead to new patterns.
  • "What is required to make a large, long-lived online group successful?" and I think I can now answer with some confidence: "It depends."
  • The normal experience of social software is failure. If you go into Yahoo groups and you map out the subscriptions, it is, unsurprisingly, a power law. There's a small number of highly populated groups, a moderate number of moderately populated groups, and this long, flat tail of failure. And the failure is inevitably more than 50% of the total mailing lists in any category. So it's not like a cake recipe. There's nothing you can do to make it come out right every time.
  • Of the things you have to accept, the first is that you cannot completely separate technical and social issues.
  • So the group is real. It will exhibit emergent effects. It can't be ignored, and it can't be programmed, which means you have an ongoing issue. And the best pattern, or at least the pattern that's worked the most often, is to put into the hands of the group itself the responsibility for defining what value is, and defending that value, rather than trying to ascribe those things in the software upfront.
  • Members are different than users. A pattern will arise in which there is some group of users that cares more than average about the integrity and success of the group as a whole. And that becomes your core group, Art Kleiner's phrase for "the group within the group that matters most."
  • But in all successful online communities that I've looked at, a core group arises that cares about and gardens effectively. Gardens the environment, to keep it growing, to keep it healthy.
  • The core group has rights that trump individual rights in some situations
  • And absolute citizenship, with the idea that if you can log in, you are a citizen, is a harmful pattern, because it is the tyranny of the majority. So the core group needs ways to defend itself -- both in getting started and because of the effects I talked about earlier -- the core group needs to defend itself so that it can stay on its sophisticated goals and away from its basic instincts.
  • All groups of any integrity have a constitution. The constitution is always partly formal and partly informal.
  • If you were going to build a piece of social software to support large and long-lived groups, what would you design for? The first thing you would design for is handles the user can invest in.
  • Second, you have to design a way for there to be members in good standing. Have to design some way in which good works get recognized. The minimal way is, posts appear with identity.
  • Three, you need barriers to participation.
  • It has to be hard to do at least some things on the system for some users, or the core group will not have the tools that they need to defend themselves.
  • The user of social software is the group, not the individual.
  • Reputation is not necessarily portable from one situation to another
  • If you want a good reputation system, just let me remember who you are. And if you do me a favor, I'll remember it. And I won't store it in the front of my brain, I'll store it here, in the back. I'll just get a good feeling next time I get email from you; I won't even remember why. And if you do me a disservice and I get email from you, my temples will start to throb, and I won't even remember why. If you give users a way of remembering one another, reputation will happen,
Barbara Lindsey

Reading in the Hyperconnected Information Era: Lessons from the Beijing Ticket Scam - 0 views

  •   Abstract: In this paper I argue that the kinds of literacy needed for making sense of information on websites are more nuanced and embedded in our everyday context than we are currently providing for learners. The kinds of analysis of websites which allow the processing of information in context are presented. This is demonstrated by an analysis of a scam site, which sold non-existent tickets to the Beijing Olympics, and a description of a phishing attempt at Twitter. The skills required to understand information presented on the web have evolved far quicker than the parallel shifts in road safety skills, and people are now required to read web sites contextually if they are to be able to make informed decisions about information available on the World Wide Web. It is proposed that this is achieved through education rather than filtering out undesirable information.
Barbara Lindsey

The Souls of the Machine: Clay Shirky's Internet Revolution - The Chronicle Review - Th... - 0 views

  • He argues that as Web sites become more social, they will threaten the existence of all kinds of businesses and organizations, which might find themselves unnecessary once people can organize on their own with free online tools. Who needs an academic association, for instance, if a Facebook page, blog, and Internet mailing list can enable professionals to stay connected without paying dues? Who needs a record label, when musicians can distribute songs and reach out to fans on their own?
  • "More people can communicate more things to more people than has ever been possible in the past, and the size and speed of this increase, from under one million participants to over one billion in a generation, makes the change unprecedented."
  • in his latest book, Cognitive Surplus: Creativity and Generosity in a Connected Age, scheduled to appear from Penguin Press this month. In it, he urges companies and consumers to stop clinging to old models and embrace what he characterizes as "As Much Chaos as We Can Stand" in adopting new Web technologies. He presses programmers and entrepreneurs to throw out old assumptions and try as many crazy, interactive Web toys as they can—to see what works, just as the students here do.
  • He figures all of Wikipedia, his gold standard for group activity online, took about 100 million hours of thought to produce. So Americans could build 2,000 Wikipedia projects a year just by writing articles instead of watching television.
  • Those new activities—and he gives plenty of examples in the book of projects already under way—could center on charity, civic engagement, coping with diseases, and more.
  • He points out that in the several decades immediately following Gutenberg's first Bible, not much really changed in European information society. Much later, some world-changing ideas came along on how to use the printing press, like the Invisible College.
  • "The problem with alchemy wasn't that the alchemists had failed to turn lead into gold—no one could do that. The problem, rather, was that the alchemists had failed uninformatively."
  • "Even when working with the same tools, they were working in a far different, and better, culture of communication."
  • Today's open-source software and the hypersharing of social networks represent a new, better order. And we're only starting to see the impact of those inventions.
  • Essentially, says Danah Boyd, a researcher for Microsoft Research and a longtime friend, Shirky thinks Karl Marx got it wrong. While critics like Slee may read any online social participation as economic exploitation, Shirky argues that people are motivated by love, not money. She points to Wikipedia: "People contribute because they enjoy the process," she says. Or academe. "Are we doing it for the pay?" "There's a lot of labor of love. People like being a part of cultural production on every level."
  • Shirky got the job at NYU because of a talk he gave at a technology conference in the late 1990s, while he was working as a freelance computer programmer and Web designer. T
  • Drawn to the classroom, he approached Yale in 1995 about teaching a class there on online social groups. Though students there backed the idea, he says, a university committee turned him down. "They killed it because they said it doesn't really make sense to talk about community online because those people aren't really meeting each other," he says.
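Shirky's "2,000 Wikipedia projects a year" figure in the highlights above is simple arithmetic on his own estimates: about 100 million hours of cumulative thought to produce Wikipedia, against roughly 200 billion hours of television Americans watch per year (both figures are his, from Cognitive Surplus):

```python
# Back-of-the-envelope check of Shirky's "2,000 Wikipedias a year" claim,
# using his own estimates from Cognitive Surplus.
hours_per_wikipedia = 100_000_000          # ~100 million hours of thought
us_tv_hours_per_year = 200_000_000_000     # ~200 billion hours of TV, per year

wikipedias_per_year = us_tv_hours_per_year / hours_per_wikipedia
```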
Barbara Lindsey

Digital Ethnography » Blog Archive » Getting Started with Web 2.0 - 0 views

  • Web 2.0 refers to new websites that are more dynamic, user-driven, and interlinked (and interlinked in new and interesting ways).
  • An RSS (Really Simple Syndication) feed is a way for news organizations, academic journals, book publishers, and virtually anybody who distributes information to distribute that information without any markup or formatting, so that your own browser or website can format it and make it look nice on your own page. You can add any RSS feed to a website like Netvibes. This allows you to have all of your favorite sites that are frequently updated viewable on one single page.
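The RSS highlight above describes distributing content "without any markup or formatting," leaving presentation to the receiving page. That round trip can be illustrated with the standard library alone; the feed XML, titles, and URLs below are invented for illustration, standing in for what an aggregator like Netvibes would fetch from a site's feed URL:

```python
# A sketch of what a feed aggregator does: parse raw RSS XML and pull out
# the unformatted content to lay out however it likes.
import xml.etree.ElementTree as ET

rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Course Blog</title>
    <item><title>Week 1: What is Web 2.0?</title><link>http://example.edu/week1</link></item>
    <item><title>Week 2: RSS and Aggregators</title><link>http://example.edu/week2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
feed_title = root.findtext("channel/title")
# Only the raw text comes through; presentation is left to the reader's page.
headlines = [(item.findtext("title"), item.findtext("link"))
             for item in root.iter("item")]
```

Aggregating several such feeds onto one page is exactly the "all of your favorite sites ... viewable on one single page" pattern the highlight describes.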
Barbara Lindsey

Minds on Fire: Open Education, the Long Tail, and Learning 2.0 (EDUCAUSE Review) | EDUC... - 0 views

  • But at the same time that the world has become flatter, it has also become “spikier”: the places that are globally competitive are those that have robust local ecosystems of resources supporting innovation and productiveness.2
  • various initiatives launched over the past few years have created a series of building blocks that could provide the means for transforming the ways in which we provide education and support learning. Much of this activity has been enabled and inspired by the growth and evolution of the Internet, which has created a global “platform” that has vastly expanded access to all sorts of resources, including formal and informal educational materials. The Internet has also fostered a new culture of sharing, one in which content is freely contributed and distributed with few restrictions or costs.
  • the most visible impact of the Internet on education to date has been the Open Educational Resources (OER) movement, which has provided free access to a wide range of courses and other educational materials to anyone who wants to use them. The movement began in 2001 when the William and Flora Hewlett and the Andrew W. Mellon foundations jointly funded MIT’s OpenCourseWare (OCW) initiative, which today provides open access to undergraduate- and graduate-level materials and modules from more than 1,700 courses (covering virtually all of MIT’s curriculum). MIT’s initiative has inspired hundreds of other colleges and universities in the United States and abroad to join the movement and contribute their own open educational resources.4 The Internet has also been used to provide students with direct access to high-quality (and therefore scarce and expensive) tools like telescopes, scanning electron microscopes, and supercomputer simulation models, allowing students to engage personally in research.
  • ...29 more annotations...
  • most profound impact of the Internet, an impact that has yet to be fully realized, is its ability to support and expand the various aspects of social learning. What do we mean by “social learning”? Perhaps the simplest way to explain this concept is to note that social learning is based on the premise that our understanding of content is socially constructed through conversations about that content and through grounded interactions, especially with others, around problems or actions. The focus is not so much on what we are learning but on how we are learning.5
  • This perspective shifts the focus of our attention from the content of a subject to the learning activities and human interactions around which that content is situated. This perspective also helps to explain the effectiveness of study groups. Students in these groups can ask questions to clarify areas of uncertainty or confusion, can improve their grasp of the material by hearing the answers to questions from fellow students, and perhaps most powerfully, can take on the role of teacher to help other group members benefit from their understanding (one of the best ways to learn something is, after all, to teach it to others).
  • This encourages the practice of what John Dewey called “productive inquiry”—that is, the process of seeking the knowledge when it is needed in order to carry out a particular situated task.
  • Becoming a trusted contributor to Wikipedia involves a process of legitimate peripheral participation that is similar to the process in open source software communities. Any reader can modify the text of an entry or contribute new entries. But only more experienced and more trusted individuals are invited to become “administrators” who have access to higher-level editing tools.8
  • by clicking on tabs that appear on every page, a user can easily review the history of any article as well as contributors’ ongoing discussion of and sometimes fierce debates around its content, which offer useful insights into the practices and standards of the community that is responsible for creating that entry in Wikipedia. (In some cases, Wikipedia articles start with initial contributions by passionate amateurs, followed by contributions from professional scholars/researchers who weigh in on the “final” versions. Here is where the contested part of the material becomes most usefully evident.) In this open environment, both the content and the process by which it is created are equally visible, thereby enabling a new kind of critical reading—almost a new form of literacy—that invites the reader to join in the consideration of what information is reliable and/or important.
  • Mastering a field of knowledge involves not only “learning about” the subject matter but also “learning to be” a full participant in the field. This involves acquiring the practices and the norms of established practitioners in that field or acculturating into a community of practice.
  • But viewing learning as the process of joining a community of practice reverses this pattern and allows new students to engage in “learning to be” even as they are mastering the content of a field.
  • Another interesting experiment in Second Life was the Harvard Law School and Harvard Extension School fall 2006 course called “CyberOne: Law in the Court of Public Opinion.” The course was offered at three levels of participation. First, students enrolled in Harvard Law School were able to attend the class in person. Second, non–law school students could enroll in the class through the Harvard Extension School and could attend lectures, participate in discussions, and interact with faculty members during their office hours within Second Life. And at the third level, any participant in Second Life could review the lectures and other course materials online at no cost. This experiment suggests one way that the social life of Internet-based virtual education can coexist with and extend traditional education.
  • Digital StudyHall (DSH), which is designed to improve education for students in schools in rural areas and urban slums in India. The project is described by its developers as “the educational equivalent of Netflix + YouTube + Kazaa.”11 Lectures from model teachers are recorded on video and are then physically distributed via DVD to schools that typically lack well-trained instructors (as well as Internet connections). While the lectures are being played on a monitor (which is often powered by a battery, since many participating schools also lack reliable electricity), a “mediator,” who could be a local teacher or simply a bright student, periodically pauses the video and encourages engagement among the students by asking questions or initiating discussions about the material they are watching.
  • John King, the associate provost of the University of Michigan
  • For the past few years, he points out, incoming students have been bringing along their online social networks, allowing them to stay in touch with their old friends and former classmates through tools like SMS, IM, Facebook, and MySpace. Through these continuing connections, the University of Michigan students can extend the discussions, debates, bull sessions, and study groups that naturally arise on campus to include their broader networks. Even though these extended connections were not developed to serve educational purposes, they amplify the impact that the university is having while also benefiting students on campus.14 If King is right, it makes sense for colleges and universities to consider how they can leverage these new connections through the variety of social software platforms that are being established for other reasons.
  • The project’s website includes reports of how students, under the guidance of professional astronomers, are using the Faulkes telescopes to make small but meaningful contributions to astronomy.
  • “This is not education in which people come in and lecture in a classroom. We’re helping students work with real data.”16
  • HOU invites students to request observations from professional observatories and provides them with image-processing software to visualize and analyze their data, encouraging interaction between the students and scientists.
  • The site is intended to serve as “an open forum for worldwide discussions on the Decameron and related topics.” Both scholars and students are invited to submit their own contributions as well as to access the existing resources on the site. The site serves as an apprenticeship platform for students by allowing them to observe how scholars in the field argue with each other and also to publish their own contributions, which can be relatively small—an example of the “legitimate peripheral participation” that is characteristic of open source communities. This allows students to “learn to be,” in this instance by participating in the kind of rigorous argumentation that is generated around a particular form of deep scholarship. A community like this, in which students can acculturate into a particular scholarly practice, can be seen as a virtual “spike”: a highly specialized site that can serve as a global resource for its field.
  • I posted a list of links to all the student blogs and mentioned the list on my own blog. I also encouraged the students to start reading one another's writing. The difference in the writing that next week was startling. Each student wrote significantly more than they had previously. Each piece was more thoughtful. Students commented on each other's writing and interlinked their pieces to show related or contradicting thoughts. Then one of the student assignments was commented on and linked to from a very prominent blogger. Many people read the student blogs and subscribed to some of them. When these outside comments showed up, indicating that the students really were plugging into the international community's discourse, the quality of the writing improved again. The power of peer review had been brought to bear on the assignments.17
  • for any topic that a student is passionate about, there is likely to be an online niche community of practice of others who share that passion.
  • Finding and joining a community that ignites a student’s passion can set the stage for the student to acquire both deep knowledge about a subject (“learning about”) and the ability to participate in the practice of a field through productive inquiry and peer-based learning (“learning to be”). These communities are harbingers of the emergence of a new form of technology-enhanced learning—Learning 2.0—which goes beyond providing free access to traditional course materials and educational tools and creates a participatory architecture for supporting communities of learners.
  • We need to construct shared, distributed, reflective practicums in which experiences are collected, vetted, clustered, commented on, and tried out in new contexts.
  • An example of such a practicum is the online Teaching and Learning Commons (http://commons.carnegiefoundation.org/) launched earlier this year by the Carnegie Foundation for the Advancement of Teaching
  • The Commons is an open forum where instructors at all levels (and from around the world) can post their own examples and can participate in an ongoing conversation about effective teaching practices, as a means of supporting a process of “creating/using/re-mixing (or creating/sharing/using).”20
  • The original World Wide Web—the “Web 1.0” that emerged in the mid-1990s—vastly expanded access to information. The Open Educational Resources movement is an example of the impact that the Web 1.0 has had on education.
  • But the Web 2.0, which has emerged in just the past few years, is sparking an even more far-reaching revolution. Tools such as blogs, wikis, social networks, tagging systems, mashups, and content-sharing sites are examples of a new user-centric information infrastructure that emphasizes participation (e.g., creating, re-mixing) over presentation, that encourages focused conversation and short briefs (often written in a less technical, public vernacular) rather than traditional publication, and that facilitates innovative explorations, experimentations, and purposeful tinkerings that often form the basis of a situated understanding emerging from action, not passivity.
  • In the twentieth century, the dominant approach to education focused on helping students to build stocks of knowledge and cognitive skills that could be deployed later in appropriate situations. This approach to education worked well in a relatively stable, slowly changing world in which careers typically lasted a lifetime. But the twenty-first century is quite different.
  • We now need a new approach to learning—one characterized by a demand-pull rather than the traditional supply-push mode of building up an inventory of knowledge in students’ heads. Demand-pull learning shifts the focus to enabling participation in flows of action, where the focus is both on “learning to be” through enculturation into a practice as well as on collateral learning.
  • The demand-pull approach is based on providing students with access to rich (sometimes virtual) learning communities built around a practice. It is passion-based learning, motivated by the student either wanting to become a member of a particular community of practice or just wanting to learn about, make, or perform something. Often the learning that transpires is informal rather than formally conducted in a structured setting. Learning occurs in part through a form of reflective practicum, but in this case the reflection comes from being embedded in a community of practice that may be supported by both a physical and a virtual presence and by collaboration between newcomers and professional practitioners/scholars.
  • The building blocks provided by the OER movement, along with e-Science and e-Humanities and the resources of the Web 2.0, are creating the conditions for the emergence of new kinds of open participatory learning ecosystems23 that will support active, passion-based learning: Learning 2.0.
  • As a graduate student at UC-Berkeley in the late 1970s, Treisman worked on the poor performance of African-Americans and Latinos in undergraduate calculus classes. He discovered the problem was not these students’ lack of motivation or inadequate preparation but rather their approach to studying. In contrast to Asian students, who, Treisman found, naturally formed “academic communities” in which they studied and learned together, African-Americans tended to separate their academic and social lives and studied completely on their own. Treisman developed a program that engaged these students in workshop-style study groups in which they collaborated on solving particularly challenging calculus problems. The program was so successful that it was adopted by many other colleges. See Uri Treisman, “Studying Students Studying Calculus: A Look at the Lives of Minority Mathematics Students in College,” College Mathematics Journal, vol. 23, no. 5 (November 1992), pp. 362–72, http://math.sfsu.edu/hsu/workshops/treisman.html.
  • In the early 1970s, Stanford University Professor James Gibbons developed a similar technique, which he called Tutored Videotape Instruction (TVI). Like DSH, TVI was based on showing recorded classroom lectures to groups of students, accompanied by a “tutor” whose job was to stop the tape periodically and ask questions. Evaluations of TVI showed that students’ learning from TVI was as good as or better than in-classroom learning and that the weakest students academically learned more from participating in TVI instruction than from attending lectures in person. See J. F. Gibbons, W. R. Kincheloe, and S. K. Down, “Tutored Video-tape Instruction: A New Use of Electronics Media in Education,” Science, vol. 195 (1977), pp. 1136–49.
Barbara Lindsey

sigilt - Favorite Web 2.0 Tools - 0 views

  •  
    Description of web 2.0 tools selected by educators.
Barbara Lindsey

Stickis - 0 views

  •  
    Stickis lets you see what your friends are interested in around the web, and talk back too. Annotate web sites and share within your network.
Barbara Lindsey

web2storytelling - home - 0 views

  •  
    Bryan Alexander and Alan Levine's companion wiki to their article on Web 2.0 storytelling