Home / Duty of care + Standards _ CU / Group items tagged Germany


Carsten Ullrich

LG Würzburg, judgment of 7 March 2017 - 11 O 2338/16 UVR - Bürgerservice - 0 views

  • Service providers that store information supplied by users, such as the respondent (Verfügungsbeklagte), must, pursuant to Recital 48 of the E-Commerce Directive (ECRL), also apply the duty of care that can reasonably be expected of them and that is laid down in national law, in order to detect and prevent certain types of illegal activity.
  • but also such content is treated as the provider's "own content" ("eigene Inhalte") where the service provider has adopted it as its own.
  • Accordingly, which measures are reasonable must be determined on the basis of the circumstances of the individual case, taking into account and weighing all affected interests and relevant legal considerations. These circumstances include the function and purpose of the service offered; the risk and number of possible infringements as well as the injured party's own responsibility; the economic advantage in the form of commissions from infringements by third parties; advertising for possibly unlawful activities; the facilitation of infringements through the provision of tools and software; the economic cost of review measures; and the effectiveness of review and safeguard measures that are possible in principle, also with a view to measures for avoiding a possible multitude of similar infringements of rights vis-à-vis other
  • Measures are not unreasonable merely because the obligated party (Schuldner) would have to deploy additional staff for monitoring; they only become unreasonable when the effort of review would call the business model itself into question.
  • On grounds of reasonableness, a duty of review may remain limited to clear violations, i.e. gross infringements that can be recognised without difficulty and without further investigation.
  • Heightened duties of review may also arise for providers that allow their customers to use their services anonymously, so that the customer could not, if need be, be identified as the infringer (Spindler/Volkmann, in: Spindler/Schuster, Recht der elektronischen Medien, 3rd ed. 2015, § 1004 BGB para. 25, with further references).
  • However, the question of technical feasibility, and thus also of reasonableness, cannot yet be reliably assessed in preliminary injunction proceedings. This goes beyond the scope of such proceedings and will have to be examined in the main proceedings, if necessary by expert opinion.
Carsten Ullrich

Article - 0 views

  • new measures are designed to make it easier to identify hate crime on the Internet. In future, platforms such as Facebook, Twitter and YouTube will not only be able to delete posts that incite hatred or contain death threats, but also to report them to the authorities, along with the user’s IP address.
  • Possibility of extending the scope of the Netzwerkdurchsetzungsgesetz (Network Enforcement Act)
  • new rules on hate crime will be added to the German Strafgesetzbuch (Criminal Code), while the definition of existing offences will be amended to take into account the specific characteristics of the Internet.
    • Carsten Ullrich
       
      internet-specific normative considerations?
Carsten Ullrich

Article - 0 views

  • Entwurf für ein Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (draft law to combat right-wing extremism and hate crime)
  • Providers of commercial telemedia services and associated contributors and intermediaries will, in future, be subject to the same information obligations as telecommunications services. A new Article 15a TMG obliges them to disclose information about their users’ inventory data if requested by the Federal Office for the Protection of the Constitution, law enforcement or police authorities, the Militärischer Abschirmdienst (Military Counterintelligence Service), the Bundesnachrichtendienst (Federal Intelligence Service) or customs authorities.
  • To this end, they are required, at their own expense, to make arrangements for the disclosure of such information within their field of responsibility. Services with over 100 000 customers must also provide a secure electronic interface for this purpose.
  • Social network providers, meanwhile, are subject to proactive reporting obligations
  • The provider must check whether this is the case and report the content immediately, as well as provide the IP address and port number of the person responsible. The user “on whose behalf the content was stored” should be informed that the information has been passed on to the BKA, unless the BKA orders otherwise.
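Taken together, the reporting duty in the draft law reads like a small data-exchange protocol: the flagged content plus the IP address and port number of the person responsible go to the BKA, and the affected user is notified unless the BKA orders otherwise. A minimal Python sketch of that flow, purely illustrative; none of these class or function names come from the law or any real BKA interface:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class HateCrimeReport:
    """Hypothetical report record: the draft law requires providers to
    forward the flagged content together with the IP address and port
    number of the person responsible."""
    content: str
    ip_address: str
    port: int
    reported_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def notify_user(report: HateCrimeReport, bka_suppression_order: bool) -> bool:
    """The user 'on whose behalf the content was stored' is informed of
    the transfer to the BKA, unless the BKA orders otherwise."""
    return not bka_suppression_order
```

The suppression flag models the one exception the annotation mentions: notification is the default, and only a BKA order switches it off.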
Carsten Ullrich

Tech companies can distinguish between free speech and hate speech if they want to - Da... - 0 views

  • Facebook has come under recent criticism for censoring LGBTQ people’s posts because they contained words that Facebook deems offensive. At the same time, the LGBTQ community is one of the groups most frequently targeted with hate speech on the platform. If users seem to “want their cake and eat it too”, the tech companies are similarly conflicted.
  • At the same time, the laws of many countries like Germany, and other international conventions, explicitly limit these freedoms when it comes to hate speech.
  • It would not be impossible for tech companies to form clear guidelines within their own platforms about what was and wasn’t permissible. For the mainly US companies, this would mean that they would have to be increasingly aware of the differences between US law and culture and those of other countries.
Carsten Ullrich

Algorithm Transparency: How to Eat the Cake and Have It Too - European Law Blog - 0 views

  • While AI tools still exist in a relative legal vacuum, this blog post explores: 1) the extent of protection granted to algorithms as trade secrets with exceptions of overriding public interest; 2) how the new generation of regulations on the EU and national levels attempt to provide algorithm transparency while preserving trade secrecy; and 3) why the latter development is not a futile endeavour. 
  • most complex algorithms dominating our lives (including those developed by Google and Facebook), are proprietary, i.e. shielded as trade secrets, while only a negligible minority of algorithms are open source. 
  • Article 2 of the EU Trade Secrets Directive
  • However, the protection granted by the Directive is not absolute. Article 1(2)(b), bolstered by Recital 11, concedes that secrecy will take a back seat if the ‘Union or national rules require trade secret holders to disclose, for reasons of public interest, information, including trade secrets, to the public or to administrative or judicial authorities for the performance of the duties of those authorities’. 
  • With regard to trade secrets in general, in the Microsoft case, the CJEU held that a refusal by Microsoft to share interoperability information with a competitor constituted a breach of Article 102 TFEU.
  • Although trade secrets remained protected from the public and competitors, Google had to disclose PageRank parameters to the Commission as the administrative authority for the performance of its investigative duties. It is possible that a similar examination will take place in the recently launched probe into Amazon’s treatment of third-party sellers.
  • For instance, in February 2020, the District Court of The Hague held that the System Risk Indication algorithm that the Dutch government used to detect fraud in areas such as benefits, allowances, and taxes violated the right to privacy (Article 8 ECHR), inter alia because it was not transparent enough: the government had neither publicized the risk model and the indicators that make up the risk model, nor submitted them to the Court (para 6 (49)).
  • Article 22 remains one of the most unenforceable provisions of the GDPR. Some scholars (see, e.g. Wachter) question the existence of such a right to explanation altogether, claiming that if the right does not withstand the balancing against trade secrets, it is of little value.
  • In 2019, to ensure competition in the platform economy, the European Parliament and the Council adopted Platform-to-Business (P2B) Regulation. To create a level playing field between businesses, the Regulation for the first time mandates the platforms to disclose to the businesses the main parameters of the ranking systems they employ, i.e. ‘algorithmic sequencing, rating or review mechanisms, visual highlights, or other saliency tools’ while recognising the protection of algorithms by the Trade Secrets Directive (Article 1(5)).
  • The recent Guidelines on ranking transparency by the European Commission interpret the ‘main parameters’ to mean ‘what drove the design of the algorithm in the first place’ (para 41).
  • The German Interstate Media Law, which entered into force in October 2020, transposes the revised Audiovisual Media Services Directive, but also goes well beyond the Directive in tackling automated decision-making that leads to prioritization and recommendation of content.
  • This obligation to ‘explain the algorithm’ makes it the first national law that, in ensuring fairness for all journalistic and editorial offers, also aims more generally at diversity of opinion and information in the digital space – a distinct human rights dimension. If the provision proves enforceable, it might serve as an example for other Member States to emulate. 
  • Lastly, the draft DSA grants the newly introduced Digital Service Coordinators, the Commission, as well as vetted researchers (under conditions to be specified) the powers of data access to ensure compliance with the DSA. The core of this right, however, is undermined in Article 31(6), which effectively allows the platforms to refuse such access based on trade secrecy concerns. 
  • This shows that although addressing algorithms in a horizontal instrument is a move in the right direction, to make it enforceable, the final DSA, as well as any ensuing guidelines, should differentiate between three tiers of disclosure: 1) full disclosure – granting supervisory bodies the right of access, which may not be refused by the IP owners, to all confidential information; 2) limited disclosure – granting vetted researchers the right of access limited in time and scope, with legal guarantees for protection of trade secrecy; and 3) explanation of main parameters – granting individuals information in accessible language without prejudice to trade secrets. 
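The three-tier disclosure model proposed in the last annotation maps naturally onto a simple access-control table: who asks determines how much of the algorithm is revealed, and on what safeguards. A hedged Python sketch, assuming the tiers exactly as the post describes them (the enum and function names are my own, not from the DSA draft):

```python
from enum import Enum, auto

class Requester(Enum):
    SUPERVISORY_BODY = auto()   # Digital Service Coordinators, Commission
    VETTED_RESEARCHER = auto()  # under conditions to be specified
    INDIVIDUAL = auto()

class DisclosureTier(Enum):
    FULL = auto()             # all confidential info; refusal not permitted
    LIMITED = auto()          # time- and scope-limited, trade secrecy safeguarded
    MAIN_PARAMETERS = auto()  # accessible-language explanation only

# Mapping implied by the blog post's three-tier proposal.
TIER_BY_REQUESTER = {
    Requester.SUPERVISORY_BODY: DisclosureTier.FULL,
    Requester.VETTED_RESEARCHER: DisclosureTier.LIMITED,
    Requester.INDIVIDUAL: DisclosureTier.MAIN_PARAMETERS,
}

def disclosure_tier(requester: Requester) -> DisclosureTier:
    """Return the level of algorithm disclosure owed to a given requester."""
    return TIER_BY_REQUESTER[requester]
```

Note the design point the post is making: only the supervisory tier overrides trade secrecy outright; the other two tiers grant access in forms that leave trade secrets protected.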