
Duty of care + Standards _ CU / Group items tagged Europe


Carsten Ullrich

Council of Europe - ETS No. 185 - Convention on Cybercrime

  • Recognising the need for co-operation between States and private industry
  • need to protect legitimate interests
  • proper balance between the interests of law enforcement and respect for fundamental human rights
  • right to freedom of expression, including the freedom to seek, receive, and impart information and ideas of all kinds, regardless of frontiers, and the rights concerning the respect for privacy;
  • United Nations, the OECD
  • European Union and the G8
  • establish as criminal offences under its domestic law,
  • producing child pornography
  • offering or making available child pornography
  • distributing or transmitting
  • procuring
  • possessing
  • expeditious preservation of traffic data is available
  • expeditious disclosure to the Party’s competent authority,
Carsten Ullrich

HUDOC - European Court of Human Rights

  • Thus, the Court considers that the applicant company was in a position to assess the risks related to its activities and that it must have been able to foresee, to a reasonable degree, the consequences which these could entail. It therefore concludes that the interference in issue was “prescribed by law” within the meaning of the second paragraph of Article 10 of the Convention.
  • The Court has found that persons carrying on a professional activity, who are used to having to proceed with a high degree of caution when pursuing their occupation, can on this account be expected to take special care in assessing the risks that such activity entails
  • Against that background, the Chamber considered that the applicant company had been in a position to assess the risks related to its activities and that it must have been able to foresee, to a reasonable degree, the consequences which these could entail.
  • Thus, the Court notes that the applicant company cannot be said to have wholly neglected its duty to avoid causing harm to third parties. Nevertheless, and more importantly, the automatic word-based filter used by the applicant company failed to filter out odious hate speech and speech inciting violence posted by readers and thus limited its ability to expeditiously remove the offending comments
  • Lastly, the Court observes that the applicant company has argued (see paragraph 78 above) that the Court should have due regard to the notice-and-take-down system that it had introduced. If accompanied by effective procedures allowing for rapid response, this system can in the Court’s view function in many cases as an appropriate tool for balancing the rights and interests of all those involved. However, in cases such as the present one, where third-party user comments are in the form of hate speech and direct threats to the physical integrity of individuals, as understood in the Court’s case-law (see paragraph 136 above), the Court considers, as stated above (see paragraph 153), that the rights and interests of others and of society as a whole may entitle Contracting States to impose liability on Internet news portals, without contravening Article 10 of the Convention, if they fail to take measures to remove clearly unlawful comments without delay, even without notice from the alleged victim or from third parties.
Carsten Ullrich

The battle against disinformation is global - Scott Shackelford | Inforrm's Blog

  • the EU is spending more money on combating disinformation across the board by hiring new staff with expertise in data mining and analytics to respond to complaints and proactively detect disinformation
  • EU also seems to be losing patience with Silicon Valley. It pressured social media giants like Facebook, Google and Twitter to sign the Code of Practice on Disinformation in 2018.
Carsten Ullrich

Problems with Filters in the European Commission's Platforms Proposal - Daphne Keller |...

  • They are shockingly expensive – YouTube’s ContentID had cost Google $60 million as of several years ago – so only incumbents can afford them. Start-ups forced to build them won’t be able to afford it, or will build lousy ones with high error rates. Filters address symptoms and leave underlying problems to fester – like, in the case of radical Islamist material, the brutal conflict in Syria, global refugee crisis, and marginalization of Muslim immigrants to the US and Europe. All these problems make filters incredibly hard to justify without some great demonstrated upside – but no one has demonstrated such a thing.
  • The DMCA moves literally billions of disputes about online speech out of courts and into the hands of private parties.
  • That allocative choice was reasonable in 1998, and it remains reasonable in 2016.
    • Carsten Ullrich
       
      I don't think so.
  • The Internet has grown exponentially in size since the DMCA was enacted, but we should not forget that the problem of large-scale infringement was an expected development—and one that the safe harbors were specifically designed to manage.
    • Carsten Ullrich
       
      Any proof for that assertion?
Carsten Ullrich

Internet law

  • "intelligence platform"