
Duty of care + Standards (CU): Group items tagged "monitoring"


Carsten Ullrich

CG v Facebook Ireland Ltd & Anor [2016] NICA 54 (21 December 2016)

  • The commercial importance of ISS providers is recognised in Recital 2 of the Directive which notes the significant employment opportunities and stimulation of economic growth and investment in innovation from the development of electronic commerce. The purpose of the exemption from monitoring is to make the provision of the service practicable and to facilitate the opportunities for commercial activity. The quantities of information described by the learned trial judge at paragraph [19] of his judgment explain why such a provision is considered necessary. Although the 2002 Regulations do not contain a corresponding provision they need to be interpreted with the monitoring provision in mind.
  • Given the quantities of information generated the legislative steer is that monitoring is not an option
  • The judge concluded that the existence of the XY litigation was itself sufficient to fix Facebook with actual knowledge of unlawful disclosure of information on Predators 2 or awareness of facts and circumstances from which it would have been apparent that the publication of the information constituted misuse of private information. In our view such a liability could only arise if Facebook was subject to a monitoring obligation
Carsten Ullrich

EUR-Lex - 52003DC0702 - EN - EUR-Lex

  • Article 15 prevents Member States from imposing on internet intermediaries, with respect to activities covered by Articles 12-14, a general obligation to monitor the information which they transmit or store or a general obligation to actively seek out facts or circumstances indicating illegal activities. This is important, as general monitoring of millions of sites and web pages would, in practical terms, be impossible and would result in disproportionate burdens on intermediaries and higher costs of access to basic services for users. [73] However, Article 15 does not prevent public authorities in the Member States from imposing a monitoring obligation in a specific, clearly defined individual case.[73] In this context, it is important to note that the reports and studies on the effectiveness of blocking and filtering applications appear to indicate that there is not yet any technology which could not be circumvented and provide full effectiveness in blocking or filtering illegal and harmful information whilst at the same time avoiding blocking entirely legal information resulting in violations of freedom of speech.
    • Carsten Ullrich
       
Justifications mainly relate to economic viability and overblocking, but not to surveillance.
  • justification for Article 15
Carsten Ullrich

Systemic Duties of Care and Intermediary Liability - Daphne Keller | Inforrm's Blog

  • Pursuing two reasonable-sounding goals for platform regulation
  • First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown based legal model
  • Second, they want to preserve existing immunities
  • ...8 more annotations...
  • A “systemic duty of care” is a legal standard for assessing a platform’s overall system for handling harmful online content. It is not intended to define liability for any particular piece of content, or the outcome of particular litigation disputes.
  • The basic idea is that platforms should improve their systems for reducing online harms. This could mean following generally applicable rules established in legislation, regulations, or formal guidelines; or it could mean working with the regulator to produce and implement a platform-specific plan.
  • In one sense I have a lot of sympathy for this approach
  • In another sense, I am quite leery of the duty of care idea.
  • The actions platforms might take to comply with a SDOC generally fall into two categories. The first encompasses improvements to existing notice-and-takedown systems.
  • The second SDOC category – which is in many ways more consequential – includes obligations for platforms to proactively detect and remove or demote such content.
  • Proactive Monitoring Measures
    • Carsten Ullrich
       
this is a bit too narrow; proactivity really means a risk-based approach, not just monitoring, but monitoring for threats and risks
  • The eCommerce Directive and DMCA both permit certain injunctions, even against intermediaries that are otherwise immune from damages. Here again, the platform’s existing capabilities – its capacity to know about and control user content – matter. In the U.K. Mosley v. Google case, for example, the claimant successfully argued that because Google already used technical filters to block illegal child sexual abuse material, it could potentially be compelled to filter the additional images at issue in his case.
Carsten Ullrich

XY v Facebook Ireland Ltd [2012] NIQB 96 (30 November 2012)

  • [19] The Order of the Court will incorporate provision for liberty to apply. By this mechanism the Plaintiff, if necessary and if so advised, will be able to seek further relief from the Court if there is any recurrence of the offending publication. Of course, in such eventuality, it will be open to Facebook, acting responsibly and in accordance with the principles and themes clearly expressed in this judgment, to proactively take the necessary removal and closure steps.
  • [20] I refuse the Plaintiff's application for the wider form of interim injunction sought by him. This was to the effect that Facebook be required to monitor the offending webpage in order to prevent republication of the offensive material. In this respect, I prefer the argument of Mr Hopkins that such an order would lack the requisite precision, could impose a disproportionate burden and, further, would potentially require excessive supervision by the Court. See Cooperative Insurance v Argyll [1997] 3 All ER 297, pages 303 – 304, per Lord Hoffmann. See also Halsbury's Laws of England, Volume 24 (Fourth Edition Reissue), paragraph 849. The propriety of granting this discrete remedy will, of course, be revisited at the substantive trial, against the backcloth of a fuller evidential matrix, which should include details of how this social networking site actually operates from day to day.
Carsten Ullrich

The secret lives of Facebook moderators in America - The Verge

  • It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.
  • The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”
  • The use of contract labor also has a practical benefit for Facebook: it is radically cheaper. The median Facebook employee earns $240,000 annually in salary, bonuses, and stock options. A content moderator working for Cognizant in Arizona, on the other hand, will earn just $28,800 per year. The arrangement helps Facebook maintain a high profit margin. In its most recent quarter, the company earned $6.9 billion in profits, on $16.9 billion in revenue. And while Zuckerberg had warned investors that Facebook’s investment in security would reduce the company’s profitability, profits were up 61 percent over the previous year.
  • ...3 more annotations...
  • Miguel takes a dim view of the accuracy figure. “Accuracy is only judged by agreement. If me and the auditor both allow the obvious sale of heroin, Cognizant was ‘correct,’ because we both agreed,” he says. “This number is fake.”
  • Even with an ever-changing rulebook, moderators are granted only the slimmest margins of error. The job resembles a high-stakes video game in which you start out with 100 points — a perfect accuracy score — and then scratch and claw to keep as many of those points as you can. Because once you fall below 95, your job is at risk. If a quality assurance manager marks Miguel’s decision wrong, he can appeal the decision. Getting the QA to agree with you is known as “getting the point back.” In the short term, an “error” is whatever a QA says it is, and so moderators have good reason to appeal every time they are marked wrong. (Recently, Cognizant made it even harder to get a point back, by requiring moderators to first get a SME to approve their appeal before it would be forwarded to the QA.)
  • Before Miguel can take a break, he clicks a browser extension to let Cognizant know he is leaving his desk. (“That’s a standard thing in this type of industry,” Facebook’s Davidson tells me. “To be able to track, so you know where your workforce is.”)
  •  
    "Pro Unlimited"
Carsten Ullrich

EUR-Lex - COM:2017:795:FIN - EN - EUR-Lex

  • In e-commerce in particular, market surveillance authorities have great difficulty tracing non-compliant products imported into the Union and identifying the responsible entity within their jurisdiction.
  • In its 2017 work programme, the Commission announced an initiative to strengthen product compliance and enforcement of Union harmonisation legislation on products, as part of the 'Goods Package'. The initiative is to address the increasing amount of non-compliant products on the Union market while offering incentives to boost regulatory compliance and ensuring fair and equal treatment that will benefit businesses and citizens.
  • The development of e-commerce is also due to a great extent to the proliferation of information society service providers, normally through platforms and for remuneration, which offer intermediary services by storing third party content, but without exercising any control over such content, thus not acting on behalf of an economic operator. Removal of content regarding non-compliant products or, where that is not feasible, blocking access to non-compliant products offered through their services should be without prejudice to the rules laid down in Directive 2000/31/EC of the European Parliament and of the Council. In particular, no general obligation should be imposed on service providers to monitor the information which they transmit or store, nor should a general obligation be imposed upon them to actively seek facts or circumstances indicating illegal activity. Furthermore, hosting service providers should not be held liable as long as they do not have actual knowledge of illegal activity or information and are not aware of the facts or circumstances from which the illegal activity or information is apparent.
  • ...4 more annotations...
  • Those powers should be sufficiently robust to tackle the enforcement challenges of Union harmonisation legislation, along with the challenges of e-commerce and the digital environment and to prevent economic operators from exploiting gaps in the enforcement system by relocating to Member States whose market surveillance authorities are not equipped to tackle unlawful practices. In particular, the powers should ensure that information and evidence can be exchanged between competent authorities so that enforcement can be undertaken equally in all Member States.
  • Compliance rates by Member State/sectors and for e-commerce and imports (improvements in availability and quality of information in Member State enforcement strategies, progress in reduction of compliance gaps)
  • (3) low deterrence of the current enforcement tools, notably with respect to imports from third countries and e-commerce
  • (4) important information gaps (i.e. lack of awareness of rules by businesses and little transparency as regards product compliance)
Carsten Ullrich

JIPLP: Editorial - Control of content on social media

  • Can technology resolve these issues? As regards technical solutions, there are already examples of these, such as YouTube’s Content ID, an automated piece of software that scans material uploaded to the site for IP infringement by comparing it against a database of registered IPs. The next challenge may be how these types of systems can be harnessed by online platform providers to address extreme and hate crime content. Again the dilemma for policy- and law-makers may be the extent to which they are prepared to cede control over content to technology companies, which will become judge, jury and executioner. 
  • who should bear the cost of monitoring and removal.
  • To block access to websites where infringing content has been hosted. In Cartier International AG & Ors v British Sky Broadcasting Ltd & Ors [2016] EWCA Civ 658 the Court of Appeal concluded that it is entirely reasonable to expect ISPs to pay the costs associated with implementing mechanisms to block access to sites where infringing content has been made available
  • ...1 more annotation...
  • Thus the cost of implementing the order could therefore be regarded as just another overhead associated with ISPs carrying on their business
Carsten Ullrich

Upload filters, copyright and magic pixie dust - Copybuzz

  • At the heart of the initiative is a plan for online platforms to “increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.” Significantly, the ideas are presented as “guidelines and principles”. That’s because they are entirely voluntary. Except that the Commission makes it quite clear that if this totally voluntary system is not implemented by companies like Facebook and Google, it will bring in new laws to make them do it on a not-so-voluntary basis. The Commission is quite eager to see swift results from these voluntary efforts, as legislative proposals could already be on the table by May 2018.
  • But the worst idea, and one that appears multiple times in the latest plans, is the routine and pervasive use of upload filters.
  • In doing so, they have caused notable collateral damage, especially to fundamental rights.
  • ...3 more annotations...
  • The European Commission is well aware that Article 15 of the E-Commerce Directive explicitly prohibits Member States from imposing “a general obligation on providers … to monitor the information which they transmit or store, [or] a general obligation actively to seek facts or circumstances indicating illegal activity.”
  • does indeed involve a “general obligation” on those companies to filter all uploads for a vast range of “illegal content”
  • That lack of good faith makes the Commission’s stubborn insistence on a non-existent technical solution to a non-existent problem even more frustrating. If it had the courage to admit the truth about the unproblematic nature of unauthorised sharing of copyright materials, it wouldn’t need to come up with unhelpful approaches like upload filters that are certain to cause immense harm to both the online world and to the EU’s Digital Single Market.
Carsten Ullrich

CopyCamp Conference Discusses Fallacies Of EU Copyright Reform Amid Ideas For Copy Chan...

  • Beyond the potential negative economic aspects, several speakers at the Copycamp conference rang the alarm bells over the potential fallout of round-the-clock obligatory monitoring and filtering of user content on the net. Diego Naranjo from the European Digital Rights initiative (EDRi) reported: “I heard one of the EU member state representatives say, ‘Why do we use this (filtering system) only for copyright?’,” he said. The idea of bringing down the unauthorised publication of copyrighted material by algorithm was “a very powerful tool in the hands of government,” he warned.
  • In contrast to the dark picture painted by many activists of copyright, multi-purpose filtering machines and the end of ownership in the time of the internet of things, chances for reform were also presented for various areas of rights protection.
  • EU copyright reform itself is a chance, argued Raegan MacDonald of the Mozilla Foundation, calling it “the opportunity of a generation to bring copyright in line with the digital age, and we want to do that.” Yet the task, as in earlier copyright legislative processes, is once more to expose what she described as the later-dismantled myths of big rights holders: that any attempt to harmonise exceptions would kill their industry.