How Robots Are Failing to Protect Copyright | Variety
Yee Lee Chen on 05 Apr 16

The robots aren't doing so hot when it comes to policing copyright infringement. Computers are responding to copyright takedown requests, and some people are misusing them.

2 stakeholders: the people posting things online, and the original creators of the content being pirated.

2 social/ethical considerations: if a new movie were leaked online, its creators would lose a lot of money, since many people would simply watch it for free instead of paying. And because computers decide the outcome of copyright requests, they can target blogs and personal websites that use other content to post reviews, summaries, or fan pages, which can potentially infringe on freedom of expression.

A possible solution: to keep blogs and personal websites from being wrongly targeted, have actual people do the job instead of robots, since a human reviewer can decide whether a website really needs to be taken down or is protected under fair use. "This infringement of copyright is called 'fair use' and is allowed for purposes of criticism, news, reporting, teaching, and parody." A robot cannot distinguish between these cases, so it would take down innocent blogs while possibly leaving sites that really are pirating content up on the internet.

List of IT systems: automated bot-based systems
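To make the core problem concrete, here is a minimal hypothetical sketch (in Python, not any real platform's actual system) of how a naive bot-based takedown system might work. The names PROTECTED_SNIPPETS and bot_review, and the sample snippets, are assumptions for illustration. The point is that the bot only asks "does protected content appear on this page?", so it fires just as readily on a review quoting one line as on a pirate copy of the whole movie:

```python
# Hypothetical, deliberately simplified takedown bot: it pattern-matches
# protected material and flags any hit, with no way to weigh fair-use
# context such as criticism, news reporting, teaching, or parody.

# Snippets a rights holder might register (placeholder examples).
PROTECTED_SNIPPETS = [
    "full leaked cut of the new movie",
    "dialogue from scene 12 of the new movie",
]


def bot_review(page_text: str) -> str:
    """Flag a page if it contains any protected snippet, ignoring context."""
    text = page_text.lower()
    for snippet in PROTECTED_SNIPPETS:
        if snippet in text:
            # Fires on a pirate mirror AND on a blog quoting one line:
            # the bot cannot distinguish infringement from fair use.
            return "TAKEDOWN"
    return "OK"


if __name__ == "__main__":
    # A fan review quoting a single line still gets flagged.
    review = "My review: the dialogue from scene 12 of the new movie is sharp."
    print(bot_review(review))  # -> TAKEDOWN
```

A human reviewer looking at the same page would see the word "review" and the surrounding commentary and recognize a fair-use case; the bot's simple matching rule has no place to represent that context, which is exactly the gap the article describes.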