
Duty of care + Standards _ CU: Group items tagged "enforcement"


Carsten Ullrich

EUR-Lex - COM:2017:795:FIN - EN - 0 views

  • In e-commerce in particular, market surveillance authorities have great difficulty tracing non-compliant products imported into the Union and identifying the responsible entity within their jurisdiction.
  • In its 2017 work programme, the Commission announced an initiative to strengthen product compliance and the enforcement of Union harmonisation legislation on products, as part of the 'Goods Package'. The initiative is to address the increasing amount of non-compliant products on the Union market while offering incentives to boost regulatory compliance and ensuring fair and equal treatment that will benefit businesses and citizens.
  • The development of e-commerce is also due to a great extent to the proliferation of information society service providers, normally through platforms and for remuneration, which offer intermediary services by storing third party content, but without exercising any control over such content, thus not acting on behalf of an economic operator. Removal of content regarding non-compliant products or, where that is not feasible, blocking access to non-compliant products offered through their services should be without prejudice to the rules laid down in Directive 2000/31/EC of the European Parliament and of the Council. In particular, no general obligation should be imposed on service providers to monitor the information which they transmit or store, nor should a general obligation be imposed upon them to actively seek facts or circumstances indicating illegal activity. Furthermore, hosting service providers should not be held liable as long as they do not have actual knowledge of illegal activity or information and are not aware of the facts or circumstances from which the illegal activity or information is apparent.
  • ...4 more annotations...
  • Those powers should be sufficiently robust to tackle the enforcement challenges of Union harmonisation legislation, along with the challenges of e-commerce and the digital environment and to prevent economic operators from exploiting gaps in the enforcement system by relocating to Member States whose market surveillance authorities are not equipped to tackle unlawful practices. In particular, the powers should ensure that information and evidence can be exchanged between competent authorities so that enforcement can be undertaken equally in all Member States.
  • Compliance rates by Member State/sectors and for e-commerce and imports (improvements in availability and quality of information in Member State enforcement strategies, progress in reduction of compliance gaps)
  • (3) low deterrence of the current enforcement tools, notably with respect to imports from third countries and e-commerce
  • (4) important information gaps (i.e. lack of awareness of rules by businesses and little transparency as regards product compliance)
Carsten Ullrich

What Facebook isn't telling us about its fight against online abuse - Laura Bliss | Inf... - 0 views

  • In a six-month period from October 2017 to March 2018, 21m sexually explicit pictures, 3.5m graphically violent posts and 2.5m forms of hate speech were removed from its site. These figures help reveal some striking points.
  • As expected, the data indicates that the problem is getting worse.
    • Carsten Ullrich
       
      problem is getting worse - use as argument - look at facebook report
  • For instance, between January and March it was estimated that for every 10,000 messages online, between 22 and 27 contained graphic violence, up from 16 to 19 in the previous three months.
  • ...9 more annotations...
  • Here, the company has been proactive. Between January and March 2018, Facebook removed 1.9m messages encouraging terrorist propaganda, an increase of 800,000 comments compared to the previous three months. A total of 99.5% of these messages were located with the aid of advancing technology.
  • But Facebook hasn’t released figures showing how prevalent terrorist propaganda is on its site. So we really don’t know how successful the software is in this respect.
    • Carsten Ullrich
       
      we need data this would be part of my demand for standardized reporting system
  • on self-regulation,
  • Between the two three-month periods there was a 183% increase in the amount of posts removed that were labelled graphically violent. A total of 86% of these comments were flagged by a computer system.
  • But Facebook’s figures also show that up to 27 out of every 10,000 comments that made it past the detection technology contained graphic violence.
  • One estimate suggests that 510,000 comments are posted every minute. If accurate, that would mean 1,982,880 violent comments are posted every 24 hours.
  • Facebook has also used technology to aid the removal of graphic violence from its site.
  • This brings us to the other significant figure not included in the data released by Facebook: the total number of comments reported by users. As this is a fundamental mechanism in tackling online abuse, the amount of reports made to the company should be made publicly available.
  • However, even Facebook still has a long way to go to get to total transparency. Ideally, all social networking sites would release annual reports on how they are tackling abuse online. This would enable regulators and the public to hold the firms more directly to account for failures to remove online abuse from their servers.
    • Carsten Ullrich
       
      my demand - standardized reporting
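The prevalence arithmetic quoted in the annotations above can be sanity-checked directly. A minimal sketch, using the article's own estimates (510,000 comments posted per minute; up to 27 per 10,000 containing graphic violence) as inputs:

```python
# Back-of-the-envelope check of the article's estimate of violent
# comments slipping past detection each day. Both input figures are
# the article's estimates, not verified platform data.

comments_per_minute = 510_000
prevalence_per_10k = 27  # upper bound reported for Jan-Mar 2018

# Scale the per-minute rate to a 24-hour day.
comments_per_day = comments_per_minute * 60 * 24

# Apply the 27-in-10,000 prevalence rate.
violent_per_day = comments_per_day * prevalence_per_10k // 10_000

print(comments_per_day)  # 734400000
print(violent_per_day)   # 1982880
```

The result, 1,982,880, matches the figure the article derives, so its arithmetic is internally consistent even if the underlying estimates are uncertain.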
Carsten Ullrich

The Web Is At A Crossroads - New Standard Enables Copyright Enforcement Violating Users... - 0 views

  • “Institutional standards should not contain elements pushed in by lobbies, since they are detrimental to public interests. Of course lobbies have financial and political means to ignore or distort standards in their products, but they want more.”
  •  
    technical standards EME
Carsten Ullrich

Facebook Publishes Enforcement Numbers for the First Time | Facebook Newsroom - 0 views

  • 86% of which was identified by our technology before it was reported to Facebook.
  • For hate speech, our technology still doesn’t work that well and so it needs to be checked by our review teams. We removed 2.5 million pieces of hate speech in Q1 2018 — 38% of which was flagged by our technology.
  • In addition, in many areas — whether it’s spam, porn or fake accounts — we’re up against sophisticated adversaries who continually change tactics to circumvent our controls.
Carsten Ullrich

Facebook is stepping in where governments won't on free expression - Wendy H. Wong and ... - 0 views

  • The explicit reference to human rights in its charter acknowledges that companies have a role in protecting and enforcing human rights.
  • This is consistent with efforts by the United Nations and other advocacy efforts to create standards on how businesses should be held accountable for human rights abuses. In light of Facebook’s entanglement in misinformation, scandals and election falsehoods, as well as genocide and incitement of violence, it seems particularly pertinent for the company.
  • To date, we have assigned such decision-making powers to states, many of which are accountable to their citizens. Facebook, on the other hand, is unaccountable to citizens in nations around the world, and a single individual (Mark Zuckerberg) holds majority decision-making power at the company.
  • ...6 more annotations...
  • In other cases, human moderators have had their decisions overturned. The Oversight Board also upheld Facebook’s decision to remove a dehumanizing ethnic slur against Azerbaijanis in the context of an active conflict over the Nagorno-Karabakh disputed region.
  • But Facebook and other social media companies do not have to engage in a transparent, publicly accountable process to make their decisions. Facebook nevertheless claims that, in its decision-making, it upholds the human right of freedom of expression. However, freedom of expression does not mean the same thing to everyone.
  • Private organizations are currently the only consistent governors of data and social media.
  • However, the Oversight Board deals with only a small fraction of possible cases.
  • Facebook’s dominance in social media, however, is notable not because it’s a private company. Mass communication has been privatized, at least in the U.S., for a long time. Rather, Facebook’s insertion into the regulation of freedom of expression and its claim to support human rights is notable because these have traditionally been the territory of governments. While far from perfect, democracies provide citizens and other groups influence over the enforcement of human rights.
  • Facebook and other social media companies, however, have no such accountability to the public. Ensuring human rights needs to go beyond volunteerism by private companies. Perhaps with the Australia versus Facebook showdown, governments finally have an impetus to pay attention to the effects of technology companies on fundamental human rights.
Carsten Ullrich

The Next Wave of Platform Governance - Centre for International Governance Innovation - 0 views

  • The shift from product- and service-based to platform-based business creates a new set of platform governance implications — especially when these businesses rely upon shared infrastructure from a small, powerful group of technology providers (Figure 1).
  • The industries in which AI is deployed, and the primary use cases it serves, will naturally determine the types and degrees of risk, from health and physical safety to discrimination and human-rights violations. Just as disinformation and hate speech are known risks of social media platforms, fatal accidents are a known risk of automobiles and heavy machinery, whether they are operated by people or by machines. Bias and discrimination are potential risks of any automated system, but they are amplified and pronounced in technologies that learn, whether autonomously or by training, from existing data.
  • Business Model-Specific Implications
  • ...7 more annotations...
  • The implications of cloud platforms such as Salesforce, Microsoft, Apple, Amazon and others differ again. A business built on a technology platform with a track record of well-developed data and model governance, audit capability, responsible product development practices and a culture and track record of transparency will likely reduce some risks related to biased data and model transparency, while encouraging (and even enforcing) adoption of those same practices and norms throughout its ecosystem.
  • policies that govern their internal practices for responsible technology development; guidance, tools and educational resources for their customers’ responsible use of their technologies; and policies (enforced in terms of service) that govern the acceptable use of not only their platforms but also specific technologies, such as face recognition or gait detection.
  • At the same time, overreliance on a small, well-funded, global group of technology vendors to set the agenda for responsible and ethical use of AI may create a novel set of risks.
  • Audit is another area that, while promising, is also fraught with potential conflict. Companies such as O’Neil Risk Consulting and Algorithmic Auditing, founded by the author of Weapons of Math Destruction, Cathy O’Neil, provide algorithmic audit and other services intended to help companies better understand and remediate data and model issues related to discriminatory outcomes. Unlike, for example, audits of financial statements, algorithmic audit services are as yet entirely voluntary, lack oversight by any type of governing board, and do not carry disclosure requirements or penalties. As a result, no matter how thorough the analysis or comprehensive the results, these types of services are vulnerable to manipulation or exploitation by their customers for “ethics-washing” purposes.
  • We must broaden our understanding of platforms beyond social media sites to other types of business platforms, examine those risks in context, and approach governance in a way that accounts not only for the technologies themselves, but also for the disparate impacts among industries and business models.
  • This is a time-sensitive issue
  • Large technology companies — for a range of reasons — are trying to fill the policy void, creating the potential for a kind of demilitarized zone for AI, one in which neither established laws nor corporate policy hold sway.
Carsten Ullrich

Algorithm Transparency: How to Eat the Cake and Have It Too - European Law Blog - 0 views

  • While AI tools still exist in a relative legal vacuum, this blog post explores: 1) the extent of protection granted to algorithms as trade secrets with exceptions of overriding public interest; 2) how the new generation of regulations on the EU and national levels attempt to provide algorithm transparency while preserving trade secrecy; and 3) why the latter development is not a futile endeavour. 
  • The most complex algorithms dominating our lives (including those developed by Google and Facebook) are proprietary, i.e. shielded as trade secrets, while only a negligible minority of algorithms are open source.
  • Article 2 of the EU Trade Secrets Directive
  • ...11 more annotations...
  • However, the protection granted by the Directive is not absolute. Article 1(2)(b), bolstered by Recital 11, concedes that secrecy will take a back seat if the ‘Union or national rules require trade secret holders to disclose, for reasons of public interest, information, including trade secrets, to the public or to administrative or judicial authorities for the performance of the duties of those authorities’. 
  • With regard to trade secrets in general, in the Microsoft case, the CJEU held that a refusal by Microsoft to share interoperability information with a competitor constituted a breach of Article 102 TFEU.
  • Although trade secrets remained protected from the public and competitors, Google had to disclose PageRank parameters to the Commission as the administrative authority for the performance of its investigative duties. It is possible that a similar examination will take place in the recently launched probe into Amazon’s treatment of third-party sellers.
  • For instance, in February 2020, the District Court of The Hague held that the System Risk Indication algorithm that the Dutch government used to detect fraud in areas such as benefits, allowances, and taxes violated the right to privacy (Article 8 ECHR), inter alia, because it was not transparent enough, i.e. the government had neither publicized the risk model and the indicators that make it up, nor submitted them to the Court (para 6(49)).
  • Article 22 still remains one of the most unenforceable provisions of the GDPR. Some scholars (see, e.g., Wachter) question the existence of such a right to explanation altogether, claiming that if the right does not withstand the balancing against trade secrets, it is of little value.
  • In 2019, to ensure competition in the platform economy, the European Parliament and the Council adopted Platform-to-Business (P2B) Regulation. To create a level playing field between businesses, the Regulation for the first time mandates the platforms to disclose to the businesses the main parameters of the ranking systems they employ, i.e. ‘algorithmic sequencing, rating or review mechanisms, visual highlights, or other saliency tools’ while recognising the protection of algorithms by the Trade Secrets Directive (Article 1(5)).
  • The recent Guidelines on ranking transparency by the European Commission interpret the ‘main parameters’ to mean ‘what drove the design of the algorithm in the first place’ (para 41).
  • The German Interstate Media Law that entered into force in October 2020, transposes the revised Audio-Visual Services Directive, but also goes well beyond the Directive in tackling automated decision-making that leads to prioritization and recommendation of content.
  • This obligation to ‘explain the algorithm’ makes it the first national law that, in ensuring fairness for all journalistic and editorial offers, also aims more generally at diversity of opinion and information in the digital space – a distinct human rights dimension. If the provision proves enforceable, it might serve as an example for other Member States to emulate. 
  • Lastly, the draft DSA grants the newly introduced Digital Service Coordinators, the Commission, as well as vetted researchers (under conditions to be specified) the powers of data access to ensure compliance with the DSA. The core of this right, however, is undermined in Article 31(6), which effectively allows the platforms to refuse such access based on trade secrecy concerns. 
  • This shows that although addressing algorithms in a horizontal instrument is a move in the right direction, to make it enforceable, the final DSA, as well as any ensuing guidelines, should differentiate between three tiers of disclosure: 1) full disclosure – granting supervisory bodies the right of access, which may not be refused by the IP owners, to all confidential information; 2) limited disclosure – granting vetted researchers the right of access limited in time and scope, with legal guarantees for protection of trade secrecy; and 3) explanation of main parameters – granting individuals information in accessible language without prejudice to trade secrets. 
Carsten Ullrich

Article - 0 views

  • Draft law on combating right-wing extremism and hate crime (Entwurf für ein Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität)
  • Providers of commercial telemedia services and associated contributors and intermediaries will, in future, be subject to the same information obligations as telecommunications services. A new Article 15a TMG obliges them to disclose information about their users’ inventory data if requested by the Federal Office for the Protection of the Constitution, law enforcement or police authorities, the Militärische Abschirmdienst (Military Counterintelligence Service), the Bundesnachrichtendienst (Federal Intelligence Service) or customs authorities.
  • To this end, they are required, at their own expense, to make arrangements for the disclosure of such information within their field of responsibility. Services with over 100 000 customers must also provide a secure electronic interface for this purpose.
  • ...2 more annotations...
  • Social network providers, meanwhile, are subject to proactive reporting obligations
  • The provider must check whether this is the case and report the content immediately, as well as provide the IP address and port number of the person responsible. The user “on whose behalf the content was stored” should be informed that the information has been passed on to the BKA, unless the BKA orders otherwise.
Carsten Ullrich

Council of Europe - ETS No. 185 - Convention on Cybercrime - 0 views

  • Recognising the need for co-operation between States and private industry
  • need to protect legitimate interests
  • proper balance between the interests of law enforcement and respect for fundamental human rights
  • ...11 more annotations...
  • right to freedom of expression, including the freedom to seek, receive, and impart information and ideas of all kinds, regardless of frontiers, and the rights concerning the respect for privacy;
  • United Nations, the OECD
  • European Union and the G8
  • establish as criminal offences under its domestic law,
  • producing child pornography
  •   offering or making available child pornography
  • distributing or transmitting
  • procuring
  • possessing
  • expeditious preservation of traffic data is available
  • expeditious disclosure to the Party’s competent authority,
Carsten Ullrich

CJEU in UPC Telekabel Wien: A totally legal court order...to do the impossible - Kluwer... - 0 views

  • Accordingly, UPC was instructed to do everything that could possibly and reasonably be expected of it to block kino.to. Whether all reasonable measures were taken was to be reviewed only in a subsequent “enforcement process”
  • The Court identified a three-way conflict between: a) copyright and related rights; b) the intermediary’s right to conduct a business; and c) the freedom of information of internet users. It repeated its Promusicae conclusion that where several fundamental rights are at stake, a fair balance must be struck between the requirements of all. The Court found that the injunctive order under consideration struck the right balance.
  • intermediaries must be careful not to infringe users’ freedom of information
  • ...12 more annotations...
  • with regard to copyright protection, the Court stressed that a complete cessation of infringements might not be possible or achievable in practice
  • this does not pose a problem, given that, as previously emphasised in the Court’s case law, there is nothing whatsoever in Article 17(2) of the Charter to suggest that intellectual property is inviolable and must be absolutely protected
  • According to the Court, internet access providers must make sure that both right-holders and users are kept happy, with no real guidance as to what measures might achieve that effect.
  • “figuring out what content is legal against what content is infringing is too hard for us poor lawyers and judges!”
  • The judgment qualifies the two SABAM cases, which found filtering incompatible with fundamental rights, by confirming that specific (in the sense of “targeted at a clearly indicated website”) blocking injunctions are permissible, as long as they do not unreasonably infringe users’ rights.
  • The order explicitly redirects the balancing exercise to a private enterprise and defers the assessment of its outcome to a later procedure.
  • The ISP has no real way of knowing what is and what is not “reasonable” in the eyes of the law.
  • It’ll be reasonable, the Court seems to say, as long as it’s not entirely ineffective, or at least tries not to be entirely ineffective, or at least suggests that users shouldn’t do this.
  • Indeed, in a recent Dutch case, the court of appeal of The Hague overturned an injunction ordering access providers ZIGGO and XS4ALL to block the well-known torrenting site The Pirate Bay, after studies confirmed no effect at all on the number of downloads from illegal sources.
  • Insisting that a symbolic “do something” gesture must be made to establish that the intermediary is opposed to piracy, even if it cannot achieve real results.
  • UK’s Justice Arnold in EMI Records v British Sky Broadcasting
  • guidelines assessing the proportionality of blocking measures be laid down by the CJEU – that would have been welcome indeed!
  •  
    UPC Telekabel Wien
Carsten Ullrich

The IPKat: France: costs of blocking injunctions to be borne by internet intermediaries - 0 views

  • Why? Because (a) everybody has to chip in to the fight against piracy - that includes ISPs and IBPs - and (b) ISPs and IBPs make profit from letting users access infringing sites, and can afford to cover such costs whereas right holders may not. As such, bearing the full costs of injunctions is no 'unbearable sacrifice' within the meaning of the CJEU's Telekabel jurisprudence.
  • The unions had asked the ISP/IBPs to block and de-list four websites providing access to protected material via streaming and/or downloading: www.allostreaming.com, www.allowshowtv.com, www.allomovies.com and www.alloshare.com.
  • The claimants also applied for the costs of the injunctions to be covered by ISP/IBPs in their entirety because they were not in the position to sustain these measures financially.
  • ...9 more annotations...
  • The Appeal Court based its decision on the fact that right holders' unions and societies were financially unable to cover the costs of injunctions, whilst ISP/IBPs were.
  • The appeal decision went further by stressing that the order was also justified by the fact that the defendants generated profits from internet users accessing the infringing websites. As a result, the Court breached ISP/IBPs' freedom to conduct business (as protected by Articles 16 and 52(2) of the Charter of Fundamental Rights of the European Union).
  • Nevertheless, the Supreme Court insisted that the judiciary had jurisdiction to require ISP/IBPs to perform any necessary measures against copyright infringement on the internet, thanks to the 2000 Directive on electronic commerce and the 2001 InfoSoc Directive (transposed into national law under Article 6-1-8 of the 2004 'LCEN' Act). The Court held that these provisions provided a lawful basis to have the costs of injunctions charged against ISP/IBPs. This is because, as "technical intermediaries", ISP/IBPs are "best placed to bring such infringing activities to an end", the Court said, quoting the words of the InfoSoc Directive (Recital 59) directly.
  • First, it confirmed that neither ISPs nor IBPs were liable for secondary infringement so long as they had no knowledge of the infringing activities or acted sufficiently promptly to put an end to known illegal acts upon notification by right holders. Second, the Supreme Court reasserted that ISP/IBPs were under no statutory obligation to undertake surveillance work of internet users.
  • The Supreme Court judges see nothing under EU law that would prevent national courts from attributing all costs to intermediaries.
  • "despite their non-liability, access and hosting providers are legally bound to contribute to the fight against illicit material and, more specifically, against the infringement of authors' and neighboring rights" ; "...[O]n the basis of the pure point of law, the decision of the Court of Appeal was legally justified". 
  • on the other hand, that neither ISPs nor IBPs demonstrated that the performance of the measures would represent an unbearable sacrifice, or that their costs would endanger their economic viability
  • It is very interesting to see French Courts give so much weight to the financial situation of the parties and the (alleged or potential) revenues generated by ISP/IBPs from infringing websites, in their application of liability rules. Indeed, the latter are usually framed as pure questions of law, disconnected from economic realities.
  • We will have to wait to see whether the position of the French court catches on in other jurisdictions, or not.