
Net 308/508 Internet Collaboration and Organisation S1 2012: Group items tagged Net308_508


Stephen R

Anatomy of an Anonymous Attack - 1 views

  •  
    This article, recently published by security firm Imperva, investigates how an Anonymous attack is mounted. A particularly interesting point is that the article makes no mention of IRC channels, instead presenting Facebook, Twitter and YouTube as Anonymous's main methods of communication. Also interesting is that such communication is referred to as recruitment: recruitment of technically savvy hackers and of less technically savvy activists who are willing to participate in the attack. Particular attention should be paid to pages 6-8, which outline the recruitment activities on Facebook, YouTube and Twitter. Imperva outlines the technical methods used to stage the attack, noting that 10-15 'Anons' work to analyse the victim website for security vulnerabilities. These are the more experienced hackers, searching for a vulnerability that might allow them to retrieve private data from the victim (p.6). Although it is not mentioned in the article, perhaps these experienced hackers collaborate using Internet Relay Chat. When no vulnerability is found, Imperva notes, Anonymous instead attempts a DDoS attack, but rather than employing the LOIC a web-based version is used for ease of participation (p.13). This way, users of any device can be recruited (through social media) to participate in the attack with a minimum of barriers to entry. Although the article focusses heavily on the technical aspects of the attack, a significant portion deals with the recruitment of participants through social media, alongside discussion of the online variant of the LOIC collaborative denial-of-service tool. Anatomy of an Anonymous Attack. (2012). Imperva. http://www.imperva.com/docs/HII_The_Anatomy_of_an_Anonymous_Attack.pdf
  •  
    This document may prove to be quite a significant additional reference for my chosen topic of the Anonymous movement and hacktivism. This article discusses precisely what Mansfield-Devine (2011) neglected to note: that within the Anonymous movement there are a number of individuals with significant hacking skills who are able to retrieve valuable data from the targets of Anonymous attacks. The article quite thoroughly deconstructs the sequence in which Anonymous attacks typically unfold, the differences between the two major types of individuals who participate, and the circumstances under which Anonymous attacks are generally able to succeed (2012). Of particular interest is the emphasis placed on acknowledging that Anonymous attacks are not always as harmless as they may appear. Another interesting note is found within the conclusion of the report, which suggests that targeted, small-scale data retrieval attacks are the preferred means of attack for the Anonymous movement and that "DDoS is the hacker's last resort" (Anatomy of an Anonymous Attack, 2012). This would suggest that, unlike many sources of information regarding Anonymous hacktivism, Imperva has identified the serious nature of many incidents involving the Anonymous movement, which do not necessarily receive as much immediate attention as a simple DDoS attack may. Anatomy of an Anonymous Attack. (2012). Imperva. Retrieved from http://www.imperva.com/docs/HII_The_Anatomy_of_an_Anonymous_Attack.pdf Mansfield-Devine, S. (2011). Anonymous: Serious threat or mere annoyance? Network Security, 1, 4-10. Retrieved from http://www.sciencedirect.com.dbgw.lis.curtin.edu.au/science/article/pii/S1353485811700046
michelangelo magasic

STEAL THIS FILM - 2 views

  •  
    Steal This Film is a documentary about BitTorrent culture centred around the story of the Swedish torrent tracking website The Pirate Bay. In telling their story, The Pirate Bay's members relate quite early on that they are not only a filesharing website but also an organisation for free speech. We see BitTorrent organisations situated within the wider context of media piracy, and filesharing networks as clandestine organisations that must remain diffuse in order to evade detection by anti-p2p groups. The Pirate Bay's struggle against media outlets is elevated to a battle against American cultural hegemony. Within this context, Kent's (2011) reading of the swarm as a simulacrum of group identity can be seen as a defence - a tactic, as de Certeau (1984) puts it, for the weak to re-appropriate the power of the strong. Filesharing is a form of protest. By publicising their struggle, The Pirate Bay build a bridge between physical and virtual communities. The film features spontaneous interviews with people on the street. "The internet is too big, you can't fight it" (27 mins), says a girl with blue hair. Is she referring to the network of computers which make up the internet, or to the strength of the communities which practise filesharing, the linkages and solidarity of people across the world? This footage awakens the viewer's sense of a link between physical and virtual activities: online collaboration breeds a solidarity between users which can echo beyond the activities of the swarm. We see BitTorrent used not solely as a method for obtaining entertainment but as a vehicle for ideological struggle. The faces in the movie are conspicuously youthful, and one sees that they collaborate not only in terms of files but also in ideas and viewpoints. We see BitTorrent as a tool for worldwide collaboration and change. References Certeau, M. de (1984), The Practice of Everyday Life. University of California Press, Berkeley. Kent M (2011), 'Strangers in the Sw
  •  
    There is no escaping the debate about copyright when studying the Internet. This, however, is a refreshing point of view on the topic. The reliability of the source is sound as long as a viewer is wary of any bias, as it is told solely from The Pirate Bay's point of view. There is also a strong representation of a youth culture. The youth appear tired of being force-fed the institutionalised approach to media that had previously existed. As the interviewees comment, the raid on The Pirate Bay was clearly a political power play and one that backfired. There is defiance towards America in particular, as the documentary presents evidence of its attempt to pressure Sweden into sabotaging those who are 'threatening' Hollywood industries. Copyright laws do not translate across international borders, and for the first time, thanks to this documentary, I could actually see how this might play out in the real world. This is both valuable and useful in the overall understanding of the BitTorrent topic. Of particular importance to me was the statement made by one of the Piratbyran creators, Rasmus Fleischer, that 'our basic principle is not about building empires' (The League of Noble Peers, 2006). This is the most crucial difference between the Hollywood approach to copyright and the P2P approach to copyright. Just because media is made available for free consumption does not mean that it will not translate into sales on any level. I went away from this documentary feeling that industry producers and distributors need to get creative with their content, listen to their consumers and create a shared experience that benefits both sides of the argument.
  •  
    This roughly thirty-minute documentary, while very "copy-left" focussed, helps to place BitTorrent within the context of global politics. It is about The Pirate Bay, one of the biggest BitTorrent trackers in history. The Pirate Bay's servers are physically located in Sweden, and this documentary shows how Swedish law has interacted with American and international laws on copyright and file sharing. It uses clips from many different interviews, including the people central to The Pirate Bay but also Swedish citizens seemingly randomly interviewed on the street. It is interesting to note that many of them do seem to have some knowledge of The Pirate Bay and also express their support for the site. This sense of community surrounding BitTorrent reminds me of the Australian youths in the "BitTorrents and Family Guy: teenage peer group interactions around a peer-to-peer Internet download community" paper. This documentary highlights the feeling of oppression and the resistance to the control of media which seem to underlie the communities who use BitTorrent. Combined with the copyright laws, these are worth thinking about because of how they influence the way people use BitTorrent to collaborate, and also how people collaborate to support file-sharing, including by demonstration, as seen in the documentary.
  •  
    This film presents various aspects of online file sharing, particularly in relation to music and movies. The topics discussed in the film include the difference in copyright laws between America and Sweden and how online file sharing has changed the nature of networking within society. The film also presents the contrast between the perspectives on online file sharing held by younger consumers and those of the older producers. In America, the major music and film industries regard peer-to-peer file sharing as an infringement of copyright, while in Sweden there is no copyright law covering the film and music productions that are available over BitTorrent. A Swedish user argued that American copyright law should not intervene in other countries because there are no geographical limitations on the Internet. The age gap also highlighted different perspectives: for example, younger users believe in the right to public access, while the older producers believe that commodities (such as music and films) cannot be given to people for free. Against this, it can be argued that the music and film industries cannot outlaw social change. Lastly, the activity of file sharing through BitTorrent has changed the way society collaborates to exchange ideas and information. For example, support for using BitTorrent is not documented on a fixed website but is passed on through online forums where users collaborate as social groups. This film relates well to the resources I have about YouTube in terms of the different perspectives based on age. Young people tend to use online media fluently and do not see the copyright implications. The movement towards file sharing has become even more apparent: social online collaboration is now the usual way popular media is consumed, and consumers recreate this media and contribute it back to the mass again.
  •  
    Steal This Film is a short 30-minute documentary that looks at the social politics and debate around file sharing and the BitTorrent client, focusing on the Swedish torrent tracking website The Pirate Bay. The documentary outlines how file sharing and copyright are touchy subjects within American law, and through the documentary we are able to hear differing opinions on who is right and who is wrong. The various people interviewed who are involved with The Pirate Bay take an 'us against the world' approach and make it clear that technically they aren't doing anything wrong, and that through the power of free speech they are making their voice heard. Numerous youths are also interviewed, and each seems to be of the copyleft opinion that what they are doing is almost a form of activism, and that these torrent communities are unable to be stopped. As a 'Pirate' myself, and through the learning that I have undertaken while at university, I would have to agree, and I think the movie/music industry's claim that it is being seriously hurt by piracy is utterly false; as one of the speakers in the video says, "We aren't going to wake up one day and find that all music artists have died because of Piracy". In fact I would go as far as to say that because of this cry-baby outlook by these industries, the BitTorrent and file-sharing communities have been strengthened.
  •  
    I was taken aback when I went to download Steal This Film and it appeared as a torrent file in BitTorrent. I suppose I wasn't used to what I perceived as 'legitimate' content being provided in the form of a torrent. The film states, "right now ten million people are using BitTorrent" and indeed, at the time of watching, I was also using BitTorrent. One of the things I found admirable, and also a little surprising, was the resilience of The Pirate Bay founders. Even after being raided and shut down by the authorities, their belief in what they were doing, and their advocacy of free speech, was too strong to just let go. I also found the film interesting in its depiction of the various anti-piracy campaigns created by Hollywood film studios, juxtaposed with interviews of young people claiming that the amount of money made by Hollywood is "absurd". Even if crew members and writers are suffering at the hands of film piracy, like the people interviewed, I find it difficult to sympathise with Hollywood's viewpoint when you can safely assume that the largest chunk of the proceeds from any film goes to the 'talent' and not to the people working so hard behind the scenes. Perhaps Hollywood losing money could be considered a positive outcome, as so many subpar films probably should never have been made in the first place. Perhaps having less money to fund any film on a whim will lead film studios to choose their projects more carefully, resulting in the delivery of quality rather than quantity to film consumers.
Stephen R

Anonymous: serious threat or mere annoyance? - 5 views

  •  
    Steve Mansfield-Devine, editor of Network Security, analyses the threat posed by the Anonymous activist hacking group. In doing so he discusses the collaborative tools used to organise the members of Anonymous into a focussed effort. The tools discussed include the Low Orbit Ion Cannon (LOIC) and its various spinoffs, Internet Relay Chat (IRC) and Twitter. Mansfield-Devine discusses the Anonymous group's usage of the LOIC as a Distributed Denial of Service (DDoS) weapon. He makes the clear point that the LOIC is only effective with enough users, which makes effective usage of the LOIC a collaborative operation: the more users collaborating with the tool, the more effective it becomes. Mansfield-Devine discusses how Anonymous members are mobilised to participate in an LOIC attack, specifying IRC and Twitter as the main channels of mobilisation. His discussion highlights IRC as the primary form of organisation, with Twitter taking a more secondary role in directing potential participants into IRC channels. Mansfield-Devine does note that Twitter became an integral part of Anonymous organisation when the group's domain names were taken offline by authorities during Anonymous operations: tweets were sent out to redirect participants to new IRC chat rooms so the attack could continue. Overall, this article concisely covers the IRC, Twitter and LOIC-based aspects of Anonymous collaboration and organisation. Mansfield-Devine, S. (2011). Anonymous: Serious threat or mere annoyance? Network Security, 1, 4-10. http://dx.doi.org.dbgw.lis.curtin.edu.au/10.1016/S1353-4858(11)70004-6
  •  
    In this article Mansfield-Devine explores the threat of the organisation Anonymous and the collaborative tools it uses to organise the group. In relation to this, he specifies that Anonymous uses the "Low Orbit Ion Cannon (LOIC) and various versions, Internet Relay Chat (IRC) and Twitter" as its key tools for facilitating organised attacks on institutions (Mansfield-Devine, 2011, p. 4). This article links to the article 'Kony 2012: The Template for Effective Crowdsourcing?' by Olubunmi Emenanjo on more than one level: both are about outside organisations acting against institutions, and both undeniably rely on social media and the power of the crowd for the mobilisation and facilitation of their actions and recruitment (Emenanjo, 2012). The success of the Kony 2012 campaign and of Anonymous's attacks can be pinpointed to how the organisations are aimed at a particular audience, reinforced by social media platforms, and most importantly how they harness networking tools to deliver their messages. However, a major difference between the two groups is that the Kony 2012 organisation has a consistent online identity, while Anonymous has anonymity. Little is known about the organisation itself, but the tools it utilises (LOIC, IRC and Twitter) lead us to assume that its audiences engage with the organisation. References Emenanjo, O. (2012). Kony 2012: The Template for Effective Crowdsourcing? Communia. Retrieved from http://stipcommunia.wordpress.com/2012/03/13/kony-2012-the-template-for-effective-crowdsourcing/ Mansfield-Devine, S. (2011). Anonymous: Serious threat or mere annoyance? Network Security, 1, 4-10. http://dx.doi.org.dbgw.lis.curtin.edu.au/10.1016/S1353-4858(11)70004-6
  •  
    Although much of this article is not particularly useful to my focus on Anonymous, it still raises some interesting points. The way in which the author plays down the impact of Anonymous's actions towards the end of the article is of particular interest. The author often refers to the disorganised nature of the Anonymous movement, and suggests frequently that although a number of individuals may be involved, automated 'botnets' are often more effective than Anonymous members (Mansfield-Devine, 2011). For my focus this is the most important part of the article, because the author neglects to take note of the Anonymous members who do more than simply use LOIC and other DDoS attacks. Although it may be true that Anonymous DDoS attacks do not result in significant, long-term damage to their targets, the disruption caused by such attacks can often provide enough distraction for Anonymous hacktivists to retrieve data from those targets. With hacktivist groups within movements such as Anonymous being responsible for the largest amount of stolen data in 2011 (AFP, 2012), Anonymous DDoS attacks could pave the way for much more damage to be done to websites than the temporary service disruptions noted by the author of this article. Mansfield-Devine, S. (2011). Anonymous: Serious threat or mere annoyance? Network Security, 1, 4-10. Retrieved from http://www.sciencedirect.com.dbgw.lis.curtin.edu.au/science/article/pii/S1353485811700046 AFP. (2012). 'Hacktivists' biggest data thieves in 2011: Verizon. Retrieved from http://au.news.yahoo.com/thewest/a/-/world/13242086/hacktivists-biggest-data-thieves-in-2011-verizon/
owen_davies

Influences on Cooperation in BitTorrent Communities - 16 views

Andrade, N., Mowbray, M., Lima, A., Wagner, G., & Ripeanu, M. (2005). Influences on Cooperation in BitTorrent Communities. Retrieved from http://people.cs.uchicago.edu/~matei/PAPERS/p2pecon.05.pd...

Net308_508 technology Bit Torrent community collaboration Cooperation

started by owen_davies on 23 Mar 12 no follow-up yet
Jocelyn Workman

http://www.usip.org/files/resources/SR252%20-%20Crowdsourcing%20Crisis%20Information%20... - 1 views

  •  
    YouTube: Need to Know | Crisis mappers: Mobile technology helps disaster victims worldwide | Uploaded by PBS. Retrieved 20 March 2012 from http://www.youtube.com/watch?v=xW7Vt5iunWE This YouTube presentation tells the story of how crisis mapping came to be a source of critical and timely support for Haitians requiring aid following the devastating 2010 earthquake. It is a remarkable example of resourcefulness, voluntary collaboration and the use of social media to assist the humanitarian aid response. The video includes a live interview with Patrick Meier, head of Ushahidi, a not-for-profit organisation, who explains that within hours of news of the quake reaching the world, he knew it would be a real challenge to get information from people on the ground in Haiti. Based on Haitians' high mobile-phone ownership (85%), he worked out that texting a message would be the best way to find out who needed help. He arranged for a local phone company to provide a number for emergency texts. The number was advertised on the radio, as 90% of the population has radio access. A call was put out on Facebook to locate volunteers who could translate messages from Haitian Kreyol into English. These messages were then forwarded to Boston, where a voluntary group of students plotted the locations on an online map. Each location was then forwarded to the US response group coordinating the distribution of aid, and within hours help was sent. I came across this video when sourcing materials and was impressed with the professional presentation, the inclusion of a Haitian recipient's experience of receiving aid after texting the number he heard on the radio, and interviews with major stakeholders. Further searches on Patrick Meier verified the story. Crisis mapping was also used during the Libyan crisis to bring aid to victims: crisis locations were extracted from posts for help on Facebook and Twitter and plotted by volunteers.
  •  
    (My commentary is actually on the PDF that is linked to, rather than the YouTube video; the reference is at the end.) This report, commissioned by the United States Institute of Peace, examines the role of Ushahidi, a crisis-mapping platform, in the relief effort following the 2010 earthquake in Haiti. It highlights the ability of crowdsourcing to provide a more reliable account of what is happening in a disaster situation than traditional intelligence-gathering methods, which do not engage the local population. It begins by describing the challenge rescuers faced when sourcing their intelligence from media reports, which tended to focus on isolated incidents of violence, wrongly spreading the idea that violence was commonplace and leading the rescue teams to delay their efforts. The report accuses the media of deliberately producing exaggerated reports, which may be true, but even the most ethical journalist can only report on what he or she experiences - if he or she sees or hears about a violent incident, the resulting report will almost certainly give the impression of violence. For the most objective and detailed picture of the state of a crowd, the largest possible portion of that crowd needs to have a voice - something an individual journalist could never facilitate. That is where Ushahidi proved a valuable tool. By aggregating SMS messages, email and social media communications from those in distress, it allowed rescuers to direct assistance appropriately. In addition to crowdsourcing the conditions of those in distress, Ushahidi also incorporated other forms of crowdsourcing: maps were sourced from the World Bank, Yahoo!, GeoEye and the U.S. government to provide geographic information, and staffing power was provided by a vast team of volunteers. This gives the case study a lot of depth. Heinzelman, J. and Waters, C. (2010) Crowdsourcing Crisis Information in Disaster-Affected Haiti Retrieved 2 April 2012 from http://www.us
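
The workflow described in the two comments above (an SMS report comes in, volunteers translate it, the location is identified and plotted, and the result is handed to responders) can be made concrete with a small sketch. This is not Ushahidi's actual code; the report structure, the toy gazetteer and the translation stub below are assumptions made purely for illustration.

    # Minimal sketch of a crisis-report pipeline: translate, geolocate, group by place.
    # All names and data are illustrative; a real deployment would use volunteer
    # translators, a proper gazetteer and a mapping front end.
    from collections import defaultdict
    from dataclasses import dataclass

    # Toy gazetteer standing in for the volunteer geolocation step.
    GAZETTEER = {
        "carrefour": (18.541, -72.399),
        "petionville": (18.512, -72.285),
    }

    @dataclass
    class Report:
        raw_text: str         # original SMS, e.g. in Haitian Kreyol
        place_name: str       # location mentioned in the message
        translated: str = ""  # filled in by a volunteer translator

    def translate(report: Report) -> Report:
        """Stand-in for the crowd of volunteer translators."""
        report.translated = f"[EN] {report.raw_text}"
        return report

    def plot_reports(reports):
        """Group geolocated reports so responders can prioritise locations."""
        grouped = defaultdict(list)
        for r in map(translate, reports):
            coords = GAZETTEER.get(r.place_name.lower())
            if coords:  # unknown places would go to a human review queue
                grouped[coords].append(r.translated)
        return grouped

    if __name__ == "__main__":
        incoming = [
            Report("Bezwen dlo ak manje", "Carrefour"),
            Report("Moun blese nan kay la", "Petionville"),
        ]
        for coords, msgs in plot_reports(incoming).items():
            print(coords, msgs)
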
Tamlin Dobrich

Kony 2012: The Template for Effective Crowdsourcing? - 25 views

A very interesting article that I believe presents a good basic understanding of the topic; however, being a WordPress blog, I would argue that it may not be a perfectly reliable or unbiased source...

Net308_508 collaboration Crowd social media kony 2012 crowdsouced interventions

Chin Sing Wong

Incentives in Bit Torrent Induce Free Riding - 27 views

As most of my readings focus on analysing the operating status of the BitTorrent world, in this task I found more readings relevant to the issue of free-riding. This paper again takes notes ...

Net308_508 collaboration community BitTorrent

owen_davies

Communities build robustness in Bit Torrent - 7 views

Van Werkhoven, B. Communities build robustness in Bit Torrent (2010) Retrieved from http://www.pds.ewi.tudelft.nl/~epema/ASCIa9/2010/Papers/Werkhoven.pdf This particular article looks at the mecha...

Net308_508 collaboration BitTorrent community

started by owen_davies on 25 Mar 12 no follow-up yet
jessica_mann

How to cheat BitTorrent and why nobody does - 19 views

Personally I use BitTorrent for the following reasons: I find it incredibly simple to use, the program loads quickly and doesn't seem to use much hardware resource and it isn't laced with advertise...

Net308_508 collaboration community BitTorrent

Dean Strautins

911.gov: Community Response Grids - 7 views

Yes, the one-to-many communication that social networks facilitate could benefit an emergency call centre by decreasing the need for the same incident to be reported multiple times. Maybe al...

Net308_508 collaboration community social media

ianzed

Making the News: Movement Organisations, Media attention and the public agenda - 18 views

This article very loosely relates to my focus on Anonymous. Although not particularly relevant to my focus, it does provide a decent contextual setting for explaining why Anonymous receives so much...

Net308_508 collaboration community Crowd participatory

theresia sandjaja

Intellectual Property and the Cultures of BitTorrent Communities - 14 views

This reading emphasises that Intellectual Property has now evolved from an issue of law, ethics and politics to an issue of culture. The shift in the cultural model of consuming media online affects the...

Net308_508 collaboration BitTorrent

Tamlin Dobrich

The More, The Wikier - 4 views

  •  
    Ball, P. (2007, February 27). The more, the wikier. Nature: International Weekly Journal of Science. Retrieved from http://www.nature.com The More, The Wikier is an article published in Nature: International Weekly Journal of Science, which explores the secret behind the quality of Wikipedia entries when anyone, anywhere has the ability to write and edit content. The article looks at three groups of researchers who "claim to have untangled the process by which many Wikipedia entries achieve their impressive accuracy". Wikipedia is an organisation in which users pool their knowledge to create an encyclopedia of information. "The percentage of edits made by the Wikipedia 'élite' of administrators" is steadily declining, and "Wikipedia is now dominated by users who are much more numerous than the elite but individually less active." The "wisdom of the crowds" principle suggests that the combined knowledge of a large and diverse group is superior to the knowledge of a few experts. Ball explains that the content accuracy and quality of Wikipedia articles are related to a high number of edits by a large number of users. For example, articles that deal with very topical issues receive a higher level of attention from a large and diverse audience and are therefore of higher quality than articles that are not as topical and thus do not attract the same attention. The three research groups referenced in the article are: Dennis Wilkinson and Bernardo Huberman of Hewlett-Packard's research laboratories, who studied how a high number of edits by a large number of users creates the 'best' Wikipedia articles; Aniket Kittur of the University of California and co-workers, who explored how the Wiki community has evolved from a small governing group to a democracy; and Ofer Arazy and colleagues at the University of Alberta, who discuss the importance of this diversification of Wikipedia contributors to the overall success of its articles.
  •  
    I found the article The More, the Wikier useful for the topic I am studying, which is Wikipedia and how James Surowiecki's 'wisdom of crowds' theory (Surowiecki, 2004) relates to it. The research Philip Ball refers to suggests that the best Wikipedia articles are those with a large number of edits by a large number of contributors (Ball, 2007, para. 2). This supports the 'wisdom of crowds' theory, which basically rests on the idea that if more people are involved in a project, the results will be stronger (Surowiecki, 2004, p. 5). The article also states that not only is it important to have a large number of contributors to achieve good results, but the contributors should also come from a wide range of demographics (Ball, 2007, para. 14). Roy Rosenzweig, the author of one of the resources I chose, Can History be Open Source? Wikipedia and the Future of the Past, and Farhad Manjoo, the author of Is Wikipedia a Victim of Its Own Success?, another article that Tamlin Dobrich uploaded to this Diigo group, both support this claim. Rosenzweig and Manjoo write about the bias in the types of Wikipedia contributors (the majority are white, English-speaking, educated, Western males), which contributes to some topics and views being missed (Rosenzweig, 2006, p. 128; Manjoo, 2009, para. 9). While this article does discuss some important points about Wikipedia and 'the wisdom of crowds' (Surowiecki, 2004) which are important to the topic I am studying, I think this resource would be more valuable if Ball had included more examples to support the statements he makes, in order to further bolster his arguments. References Ball, P. (2007, February 27). The More, the Wikier. Nature. doi: 10.1038/news070226-6 Manjoo, F. (2009, September 28). Is Wikipedia a Victim of Its Own Success?. Time. Retrieved from http://www.time.com/time/magazine/ar
  •  
    This article takes a look at the crowdsourcing idea that Wikipedia thrives on: 'lots of edits by lots of people'. Crowdsourcing makes use of the knowledge of crowds: the more people you have contributing information to an article, the more information the article will contain. This is, however, affected when fewer people begin to contribute to the writing and collaboration process. A person contributing to a Wikipedia page may only be making a change as small as a simple grammatical correction, but it means quite a lot to the overall aesthetic of the page. People are far less likely to believe the information presented by an article filled with errors and punctuation problems. It might seem like a small issue, but this is how many hands make light work: Wikipedia's reliability comes from its ability to be edited by many people making small alterations. It is strange, however, that your other article, regarding Wikipedia being its own worst enemy, makes points about why Wikipedia is leaning towards extinction, mainly concerning the decreasing number of people editing. So is Wikipedia going to stay strong, or will it slowly become just another encyclopedia?
  •  
    Ball's article highlights the successful nature of Wikipedia's open source network and how quality of information is achieved. He suggests that the 'secret' to Wikipedia's credibility is the increasing number of contributors and the 'diversification' they bring to the platform through collective knowledge (Ball, 2007). I can relate Ball's article to Surowiecki's (2004) The Wisdom of Crowds because it reinforces the notion that people must be unrelated, independent, and have diversity of mind from one another to form good opinions. The architecture of the collaborative platform Wikipedia harnesses the 'power of the crowds' in a way that encourages diverse participation, as opposed to a group-think scenario, and thus produces 'wisdom' through quality information (Surowiecki, 2004, p. 5). Ball observes that Wikipedia's structure allows for an above-average quality of information on more topical articles. This occurs because popular topics create more traffic, which in turn enables more contributors to edit an article, therefore creating more 'diverse' and 'reliable' information (Ball, 2007). This reinforces the quality of an article through diversification and mass collaboration. This notion of 'quality' can be applied to the Kony 2012 campaign page on Wikipedia (http://en.wikipedia.org/wiki/Kony_2012), which has been edited over 500 times and has been viewed 1,227,982 times since 6 March 2012, when the Kony 2012 campaign was first launched (Wikipedia Article Page Statistics, 2012). However, it is at this point that the similarities between Ball and Surowiecki cease. According to Ball, the Kony 2012 Wikipedia article is a prime example of a topical issue. The statistics reinforce his observations about Wikipedia's crowds and how they are able to create credible and reliable information due to the diversification brought into the article by 1,227,98
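
A rough sketch of the quality signal the comments above describe: counting how many edits an article has received and how many distinct editors made them, as a crude proxy for the "lots of edits by lots of people" claim. The revision records and function names below are invented for illustration and are not taken from Ball's study.

    # Count edits and unique editors per article from a toy revision log.
    from collections import defaultdict

    def edit_statistics(revisions):
        """revisions: iterable of (article_title, editor_name) pairs, one per edit."""
        edits = defaultdict(int)
        editors = defaultdict(set)
        for article, editor in revisions:
            edits[article] += 1
            editors[article].add(editor)
        return {a: {"edits": edits[a], "unique_editors": len(editors[a])}
                for a in edits}

    sample = [("Kony 2012", "anna"), ("Kony 2012", "ben"), ("Kony 2012", "anna"),
              ("Obscure topic", "carl")]
    print(edit_statistics(sample))
    # The topical article shows both more edits and more distinct editors.
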
Tamlin Dobrich

Is Wikipedia a Victim of Its Own Success? - 8 views

  •  
    Manjoo, F. (2009, September 28). Is Wikipedia a Victim of Its Own Success? Time Magazine. Retrieved from http://www.time.com/time/magazine Is Wikipedia a Victim of Its Own Success? is an article which suggests that Wikipedia's achievement level has reached its peak and that the site will eventually see its downfall. The article looks in depth at the potential causes of Wikipedia's slowing growth and how these elements could possibly lead to the community's eventual failure. It suggests one reason for Wikipedia's decelerating growth rate is simply that "the site has hit the natural limit of knowledge expansion" and the only possible remaining contributions are obscure topics and "janitorial" editing jobs such as formatting and fixing grammar. The article claims "Wikipedia's natural resource is emotion" and that editors are motivated by the "rush of joy" they receive when contributing their unique wisdom to an audience of 300 million people. What this means is that as the need for significant edits diminishes, so too does participation enthusiasm. Additionally, as Wikipedia has grown, so too have the bureaucracy and complex laws of Wikipedia, resulting in a community that has become unwelcoming to novice Wikipedians. The article discusses how Wikipedia's editors are made up of a narrow class of participants dominated by young males from wealthy countries and academic backgrounds. The Wikipedia author-base is not as broad and diverse as first thought, and it seems "the encyclopedia is missing the voices of people in developing countries, women and experts in various specialties that have traditionally been divorced from tech". This too is given as a reason for Wikipedia's imminent downfall.
  •  
    An interesting topic of diminishing contributors, and a conclusion I had already theorised must be happening, for the exact reasons stated in the article. I think this article will be good to reflect on in future years. Maybe a future article will be titled If You Do Not Innovate Then You Die. I see Wikipedia only having to start including a genealogy aspect, where everyone can geo-tag relatives' grave sites and add stories about them and their relatives and what they achieved in their lives, to see a boom in contributors and to tie all the history in Wikipedia to real, everyday people. So when I read in Wikipedia about a civil war or the history of a country, I could also choose to see whose friends' relatives were there at that time, and so on. Later, DNA results could be added as well. So I do not see Wikipedia dying if it innovates.
  •  
    Is Wikipedia a Victim of Its Own Success? is an interesting article, as it suggests that since 2007 the number of people contributing to Wikipedia has decreased (Manjoo, 2009, para. 2). This is further reinforced by the following graph from the Wikipedia website (http://strategy.wikimedia.org/wiki/File:WMFArticlesVsContrib.png), which also shows that the number of contributors is plateauing (Bridgestone Partners, 2009). Farhad Manjoo's explanation for this - that the encyclopedia has "hit the natural limit of knowledge expansion" and the only editing jobs left are 'janitorial' - seems plausible (Manjoo, 2009, para. 6). Personally, this is what I have found through my own use of Wikipedia: while there are areas which need some work, they are generally topics and jobs which are rather mundane. The success of collaborative projects does rest on ensuring the contributors are enthusiastic about what they are doing, in order for them to continue to produce quality contributions (Anthony, Smith & Williamson, 2007). One of the resources I chose for this assignment further reinforces this. Katherine Ehmann, Andrew Large and Jamshid Beheshti, in Collaboration in Context: Comparing Article Evolution among Subject Disciplines in Wikipedia, find through their research that an average of 90.3 percent of the initial Wikipedia article text remained over time (Ehmann et al., 2008, para. 40). Therefore, it seems that contributors are less inclined to change a great deal of the original entry, and if Manjoo's suggestions are correct, and Wikipedia does already cover the majority of the topics required by users, there is less chance that contributors will continue to go back and edit these existing entries. As Dean Strautins suggests in the comment above, Wikipedia may need to look into new ways of continuing to engage their contribu
  •  
    References Anthony, D., Smith, S.W., & Williamson, T. (2007) The Quality of Open Source Production: Zealots and Good Samaritans in the Case of Wikipedia. Dartmouth Computer Science Technical Report TR2007-606. Retrieved from http://www.cs.dartmouth.edu/reports/TR2007-606.pdf Bridgestone Partners. (2009). File: WMFArticlesVsContrib.png. Retrieved from http://strategy.wikimedia.org/wiki/File:WMFArticlesVsContrib.png Ehmann, K., Large, A., & Beheshti, J. (2008). Collaboration in Context: Comparing Article Evolution among Subject Disciplines in Wikipedia. First Monday, 13(10). Retrieved from: http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/2217/2034 Manjoo, F. (2009, September 28). Is Wikipedia a Victim of Its Own Success?. Time. Retrieved from http://www.time.com/time/magazine/article/0,9171,1924492,00.html
  •  
    This article is related to my topic and starts with a brief summary of Wikipedia's beginnings. Wikipedia started in 2001 and allowed Wikipedians to contribute and share their articles with others. Wikipedia increased its articles slowly; in 2008 about 2,200 articles were being added to Wikipedia every day, and in 2009 Wikipedia had about 3 million articles in English. In doing so, Wikipedia broke the record held by the famous Chinese Yongle encyclopedia. The article mentions that there are thousands of active volunteers who are editing articles or publishing new ones; volunteers check articles to correct them and make them more valid. In addition, on Wikipedia some topics attract a large number of people; for example, a subject like "Barack Obama" has a large number of viewers, while articles about other, ordinary people do not have that many viewers, and this is a big gap for Wikipedia, because it needs to update those kinds of subjects too. The article mentions that in Wikipedia's early days volunteers could easily become part of Wikipedia's staff, and editing or publishing articles was not hard, but now volunteers must obey certain rules and gather credit to get permission from Wikipedia to publish their articles, so volunteers may wonder why they should contribute to Wikipedia, and these rules may decrease the number of volunteers.
  •  
    The change in the rate of publishing material does not determine the success of a project such as Wikipedia. New material will be sourced for Wikipedia because the world is constantly evolving. Wikipedia's only downfall is the number of people who contribute. When Wikipedia articles are monitored by users, the moderators can control their own pages, which they see as perfection because they have written the majority of them. This is the exact reason why people have begun to shy away from adding to or editing Wikipedia pages. Does this mean, however, that Wikipedia will fail at some point? I believe nothing could be further from the truth. I think Wikipedia will simply run in cycles: as new topics are generated, new experts will be required to moderate and new people will be needed to add subject matter. As more people begin to collaborate on these pages, more and more people will feel confident to edit themselves. Think of the Wikipedia cycle as one that is constantly changing, with both highs and lows of activity. This current inactive period will not last long. This unit looks at the collaborative process that is being undertaken throughout the web, and it is important to understand that without people adding their own pieces the puzzle is never going to be finished. Will Wikipedia run the cycle as my theory predicts?
  •  
    This article brings up a very interesting idea: the concept of an endpoint for Web 2.0 communities. As the author relates it, this would occur as a Malthusian collapse. Whilst at first glance this seems unfeasible given the infinite expanse of virtual pastures, the article makes some interesting points for consideration: the number of contributors on Wikipedia is dropping, and it seems we have run out of topics to write about. It is interesting to compare the Wikipedia community to that of BitTorrent, which has found renewed growth, and purpose, in the context of its struggle against copyright laws. Wikipedia has been hailed as a revolutionary form of knowledge democratisation; it is hard to imagine that Wikipedians don't share a sense of purpose in their collaboration, and perhaps even harder to imagine that we are running out of things to write about. Whilst this article is from a highly reputable source, its bias might be considered to follow that of the conservative media towards copyleft; this is highlighted by phrases like 'Wikipedia's joyride', which suggest the growth of the site was frivolous. Considering the data it presents, the article is certainly very relevant to an understanding of online collaboration, and thought provoking. I cannot help but think that there are still multitudes of topics to be written about; how many contributors, for example, have penned a page for themselves? Whilst ostensibly trivial, this might be the kind of interaction that sees renewed interest in the site and attracts the minority demographics which Gardner says the site needs to make its community richer (p.2). Perhaps the flagging interest in the site comes from the site moving too close to the status quo; as the BitTorrent community has seen, it needs to be reminded of its position in an ideological shift.
  •  
    This article starts with a brief summary of Wikipedia's beginnings. Wikipedia started in 2001 and allowed Wikipedians to contribute and share their articles with others. Wikipedia increased its articles slowly; in 2008 about 2,200 articles were being added to Wikipedia every day, and in 2009 Wikipedia had about 3 million articles in English. In doing so, Wikipedia broke the record held by the famous Chinese Yongle encyclopedia (Manjoo, 2009). According to my own studies, Wikipedia has different levels of articles, divided into low-, medium- and high-quality, and different people must play different roles, such as linking, editing and writing. For example, cleaning up other editors' mistakes is a very important part, because some people do not add valuable information, and editors must come along to raise an article's quality; the article may then need another editor to correct it again, and this process may need to continue many times to raise the quality of that article. However, that does not mean casual users' work is not worthwhile, because they can attract more well-rounded contributors who make more valuable articles. To help contributors, the University of Arizona suggested wiki software which guides contributors on what they should do; for example, they will be made aware that an article needs more links or references, or that it needs more editing and writing (Conger, 2010). Conger, C. (2010). Who writes Wikipedia articles? Retrieved from http://news.discovery.com/human/wikipedia-community-articles.html
Dean Strautins

How organisations framed the 2009 H1N1 pandemic via social and traditional media - 5 views

This paper dazzles me with its non-stop cramming of terms and references. I simply cannot hold all those reference points in my head at the same time to be able to come up with insightful lear...

Net308_508 community social media Crowd participatory technology

Tamlin Dobrich

Harnessing the Wisdom of Crowds in Wikipedia: Quality Through Coordination - 5 views

  •  
    Kittur, A., & Kraut, R. (2008). Harnessing the Wisdom of Crowds in Wikipedia: Quality Through Coordination. Carnegie Mellon University. Retrieved 2012, March 19th from http://kraut.hciresearch.org/sites/kraut.hciresearch.org/files/articles/Kittur08-WikipediaWisdomOfCrowds_CSCWsubmitted.pdf Harnessing the Wisdom of Crowds in Wikipedia: Quality Through Coordination is a study that looks into "the critical importance of coordination in effectively harnessing the 'wisdom of the crowds' in online production environments". The article suggests that Wikipedia's success relies on significant and varied coordination among its users, and is not just determined by a large and diverse author-base as proposed in other studies (Arazy, Morgan & Patterson, 2006). Elements such as the editors' coordination methods, the article lifecycle, and task interdependence determine whether a large author-base will be effective or counterproductive in achieving high Wikipedia entry quality. The study found that unspoken expectations and a shared understanding between authors (implicit coordination) encouraged positive results when collaborating with a large author-base, whereas more editors had a negative effect on article quality when coordination relied on direct communication and verbal planning (explicit coordination). During the early stages of article development, both implicit and explicit coordination tend to promote content quality because the authors need to establish the structure, direction and scope of the article. For these high-coordination tasks, the study found it was more beneficial to have a small or core group of editors to set direction; as the article became more established, value could be maximised by distributing low-coordination tasks, such as fixing grammar, correcting vandalism and creating links, to a larger author-base.
  •  
    This paper discusses how the online community can increase the size and quality of Wikipedia's articles. On Wikipedia, 40% of edits are made with the help of discussion pages, which focus on the development of policies and procedures, communication and consensus building. Most editors read the discussion page to learn how they can increase the quality of an article (Kittur & Kraut, n.d.). According to my own studies, the most exciting research on Wikipedia concerns how many edits an article needs to reach the highest quality, and why some articles reach a high level of quality while others do not. Some contributors like to read and edit articles on similar subjects and do not edit other articles. So Wikipedia needs software to tell contributors which tasks need doing. For example, one article may need more reference links while another needs more grammatical correction, and of course there are some people whose interest is finding relevant links and others who like correcting grammatical mistakes; they just need to know which article needs their help. Such software can point contributors towards the needs of particular articles, and with the help of contributors Wikipedia can achieve an even level of quality across its articles (Ram, 2010). Ram, S. (2010). Who does what on Wikipedia? Available at http://www.redorbit.com/news/technology/1832403/who_does_what_on_wikipedia/
Tamlin Dobrich

Wikipedia: organisation from a bottom-up approach - 3 views

  •  
    Jaap van den Herik, H., Postma, E., & Spek, S. (2006). Wikipedia: organisation from a bottom-up approach. Maastricht University. Retrieved 2012, March 19th from http://arxiv.org/pdf/cs/0611068v2.pdf The article Wikipedia: organisation from a bottom-up approach is a study of Wikipedia as a successful self-managing team, via an analysis of the Dutch Wikipedia. The study explores how Wikipedia successfully creates a cohesive and logical data structure through bottom-up organisation in which the division of labour is autonomous. The article suggests that this bottom-up structure, with many contributors working towards a common goal, enables greater speed and efficiency, subsequently allowing Wikipedia to cover new developments faster than other encyclopedias. Additionally, this structure, coupled with the online nature of the information network, encourages more communication and cooperation between divisions, increased enthusiasm among participants, and decreased managerial overheads. In terms of Wikipedia's content organisation, a sample study of Wikipedia articles demonstrated article clustering, scale-freeness, and potentially even small-worldliness, indicating that Wikipedia's content is itself an organised network. Finally, the article looks at the varying article and author types and analyses their relationship. The study found that articles which receive a low average of edits per author (average of edits = number of edits on an article divided by the number of unique authors on the same article) in general "deal with topic areas that most people have at least some expertise in, or topic areas that everyone claims to know about". Contrastingly, articles with a high average of edits per author were generally on more specialised topics. What this means is that articles which cover mainstream topics attract a larger and more diverse crowd of authors (
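
The comment above quotes the study's measure directly: average edits per author equals the total number of edits on an article divided by the number of unique authors of that article. A minimal worked example of that formula follows; the revision data is invented for illustration.

    # Average edits per author, as defined in the annotation above.
    def average_edits_per_author(revision_authors):
        """revision_authors: list of author names, one entry per edit."""
        if not revision_authors:
            return 0.0
        return len(revision_authors) / len(set(revision_authors))

    # Many casual editors each making a few edits -> low average (mainstream topic).
    print(average_edits_per_author(["a", "b", "c", "d", "a"]))   # 1.25
    # A handful of specialists editing repeatedly -> high average (niche topic).
    print(average_edits_per_author(["x", "x", "x", "y", "y"]))   # 2.5
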
Jarrad Long

Reips, U-D & Garaizar, P. (2011) Mining Twitter: A source for psychological wisdom of t... - 10 views

This article discusses the usefulness of Twitter as a tool for research. Researcher Pablo Garaizar suggests that monitoring large volumes of tweets and identifying trends in what users are saying -...
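
A minimal sketch of the kind of trend-spotting the snippet above describes, here reduced to counting hashtag frequencies over a batch of tweet texts. The sample tweets and the function are illustrative assumptions only; studies of this kind would draw on the Twitter API at a far larger scale.

    # Count hashtag frequencies across a collection of tweet texts.
    from collections import Counter
    import re

    def trending_hashtags(tweets, top_n=3):
        """Return the most frequent hashtags found in the given tweet texts."""
        tags = Counter()
        for text in tweets:
            tags.update(t.lower() for t in re.findall(r"#\w+", text))
        return tags.most_common(top_n)

    sample_tweets = [
        "Feeling hopeful today #haiti #relief",
        "Donate if you can #relief",
        "So much rain #haiti",
    ]
    print(trending_hashtags(sample_tweets))  # [('#haiti', 2), ('#relief', 2), ...]
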

Net308_508 collaboration Crowd participatory

Velia Torres

Interactive or reactive? Marketing with Twitter - 15 views

This paper aims to analyse the effectiveness of Twitter usage across six different organisations holding twelve different Twitter accounts. Despite the high number of organisations using Twitter t...

Net308_508 collaboration community Crowd participatory technology

Stephen R

Online Activism - 14 views

My topic of choice is the Anonymous activist group, whose activities are often, but not exclusively, enacted online. The online activism by Anonymous is similar to the online activism described in Y...

Net308_508 collaboration organisation crowds china kony 2012 online activism
