Duty of care + Standards _ CU / Group items tagged: ecommerce


Carsten Ullrich

EUR-Lex - 52003DC0702 - EN

  • Article 15 prevents Member States from imposing on internet intermediaries, with respect to activities covered by Articles 12-14, a general obligation to monitor the information which they transmit or store or a general obligation to actively seek out facts or circumstances indicating illegal activities. This is important, as general monitoring of millions of sites and web pages would, in practical terms, be impossible and would result in disproportionate burdens on intermediaries and higher costs of access to basic services for users. [73] However, Article 15 does not prevent public authorities in the Member States from imposing a monitoring obligation in a specific, clearly defined individual case. [73] In this context, it is important to note that the reports and studies on the effectiveness of blocking and filtering applications appear to indicate that there is not yet any technology which could not be circumvented and provide full effectiveness in blocking or filtering illegal and harmful information whilst at the same time avoiding blocking entirely legal information resulting in violations of freedom of speech.
    • Carsten Ullrich
      The justifications mainly relate to economic viability and overblocking, not to surveillance.
  • justification for Article 15
Carsten Ullrich

Systemic Duties of Care and Intermediary Liability - Daphne Keller | Inforrm's Blog

  • pursuing two reasonable-sounding goals for platform regulation
  • First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown based legal m…
  • Second, they want to preserve existing immunities
  • “Systemic duty of care” is a legal standard for assessing a platform’s overall system for handling harmful online content. It is not intended to define liability for any particular piece of content, or the outcome of particular litigation disputes.
  • The basic idea is that platforms should improve their systems for reducing online harms. This could mean following generally applicable rules established in legislation, regulations, or formal guidelines; or it could mean working with the regulator to produce and implement a platform-specific plan.
  • In one sense I have a lot of sympathy for this approach
  • In another sense, I am quite leery of the duty of care idea.
  • The actions platforms might take to comply with a SDOC generally fall into two categories. The first encompasses improvements to existing notice-and-takedown systems.
  • The second SDOC category – which is in many ways more consequential – includes obligations for platforms to proactively detect and remove or demote such content.
  • Proactive Monitoring Measures
    • Carsten Ullrich
      This is a bit too narrow; proactivity really means a risk-based approach, not just monitoring, but monitoring for threats and risks.
  • The eCommerce Directive and DMCA both permit certain injunctions, even against intermediaries that are otherwise immune from damages. Here again, the platform’s existing capabilities – its capacity to know about and control user content – matter. In the U.K. Mosley v. Google case, for example, the claimant successfully argued that because Google already used technical filters to block illegal child sexual abuse material, it could potentially be compelled to filter the additional images at issue in his case.