Duty of care + Standards _ CU - Group items tagged: public

Carsten Ullrich

Algorithm Transparency: How to Eat the Cake and Have It Too - European Law Blog - 0 views

  • While AI tools still exist in a relative legal vacuum, this blog post explores: 1) the extent of protection granted to algorithms as trade secrets with exceptions of overriding public interest; 2) how the new generation of regulations on the EU and national levels attempt to provide algorithm transparency while preserving trade secrecy; and 3) why the latter development is not a futile endeavour. 
  • The most complex algorithms dominating our lives (including those developed by Google and Facebook) are proprietary, i.e. shielded as trade secrets, while only a negligible minority of algorithms are open source. 
  • Article 2 of the EU Trade Secrets Directive
  • However, the protection granted by the Directive is not absolute. Article 1(2)(b), bolstered by Recital 11, concedes that secrecy will take a back seat if the ‘Union or national rules require trade secret holders to disclose, for reasons of public interest, information, including trade secrets, to the public or to administrative or judicial authorities for the performance of the duties of those authorities’. 
  • With regard to trade secrets in general, in the Microsoft case, the CJEU held that a refusal by Microsoft to share interoperability information with a competitor constituted a breach of Article 102 TFEU.
  • Although trade secrets remained protected from the public and competitors, Google had to disclose Page Rank parameters to the Commission as the administrative authority for the performance of its investigative duties. It is possible that a similar examination will take place in the recently launched probe into Amazon’s treatment of third-party sellers. 
  • For instance, in February 2020, the District Court of The Hague held that the System Risk Indication algorithm that the Dutch government used to detect fraud in areas such as benefits, allowances, and taxes violated the right to privacy (Article 8 ECHR), inter alia, because it was not transparent enough, i.e. the government had neither publicized the risk model and indicators that make up the risk model, nor submitted them to the Court (para 6.49).
  • Article 22 remains one of the most unenforceable provisions of the GDPR. Some scholars (see, e.g. Wachter) question the existence of such a right to explanation altogether, claiming that if the right does not withstand the balancing against trade secrets, it is of little value.
  • In 2019, to ensure competition in the platform economy, the European Parliament and the Council adopted Platform-to-Business (P2B) Regulation. To create a level playing field between businesses, the Regulation for the first time mandates the platforms to disclose to the businesses the main parameters of the ranking systems they employ, i.e. ‘algorithmic sequencing, rating or review mechanisms, visual highlights, or other saliency tools’ while recognising the protection of algorithms by the Trade Secrets Directive (Article 1(5)).
  • The recent Guidelines on ranking transparency by the European Commission interpret the ‘main parameters’ to mean ‘what drove the design of the algorithm in the first place’ (para 41).
  • The German Interstate Media Law, which entered into force in October 2020, transposes the revised Audiovisual Media Services Directive, but also goes well beyond the Directive in tackling automated decision-making that leads to prioritization and recommendation of content.
  • This obligation to ‘explain the algorithm’ makes it the first national law that, in ensuring fairness for all journalistic and editorial offers, also aims more generally at diversity of opinion and information in the digital space – a distinct human rights dimension. If the provision proves enforceable, it might serve as an example for other Member States to emulate. 
  • Lastly, the draft DSA grants the newly introduced Digital Service Coordinators, the Commission, as well as vetted researchers (under conditions to be specified) the powers of data access to ensure compliance with the DSA. The core of this right, however, is undermined in Article 31(6), which effectively allows the platforms to refuse such access based on trade secrecy concerns. 
  • This shows that although addressing algorithms in a horizontal instrument is a move in the right direction, to make it enforceable, the final DSA, as well as any ensuing guidelines, should differentiate between three tiers of disclosure: 1) full disclosure – granting supervisory bodies the right of access, which may not be refused by the IP owners, to all confidential information; 2) limited disclosure – granting vetted researchers the right of access limited in time and scope, with legal guarantees for protection of trade secrecy; and 3) explanation of main parameters – granting individuals information in accessible language without prejudice to trade secrets. 
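
Read as a technical design, the three tiers quoted above amount to an access-control scheme. A minimal sketch in Python (illustrative only; the class and helper names are hypothetical, not drawn from the blog post or the draft DSA):

    from enum import Enum, auto

    class DisclosureTier(Enum):
        """The three disclosure tiers proposed above for the final DSA."""
        FULL = auto()         # supervisory bodies: access to all confidential information
        LIMITED = auto()      # vetted researchers: time- and scope-limited access
        EXPLANATION = auto()  # individuals: main parameters in accessible language

    def reaches_trade_secrets(tier: DisclosureTier) -> bool:
        # Only the first two tiers touch material protected as trade secrets;
        # the third is explicitly "without prejudice to trade secrets".
        return tier in (DisclosureTier.FULL, DisclosureTier.LIMITED)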
Carsten Ullrich

Facebook is stepping in where governments won't on free expression - Wendy H. Wong and ... - 0 views

  • The explicit reference to human rights in its charter acknowledges that companies have a role in protecting and enforcing human rights.
  • This is consistent with efforts by the United Nations and other advocacy efforts to create standards on how businesses should be held accountable for human rights abuses. In light of Facebook’s entanglement in misinformation, scandals and election falsehoods, as well as genocide and incitement of violence, it seems particularly pertinent for the company.
  • To date, we have assigned such decision-making powers to states, many of which are accountable to their citizens. Facebook, on the other hand, is unaccountable to citizens in nations around the world, and a single individual (Mark Zuckerberg) holds majority decision-making power at the company.
  • In other cases, human moderators have had their decisions overturned. The Oversight Board also upheld Facebook’s decision to remove a dehumanizing ethnic slur against Azerbaijanis in the context of an active conflict over the Nagorno-Karabakh disputed region.
  • However, the Oversight Board deals with only a small fraction of possible cases.
  • Private organizations are currently the only consistent governors of data and social media. 
  • But Facebook and other social media companies do not have to engage in a transparent, publicly accountable process to make their decisions. Facebook nevertheless claims that in its decision-making it upholds the human right of freedom of expression. Yet freedom of expression does not mean the same thing to everyone.
  • Facebook’s dominance in social media, however, is notable not because it’s a private company. Mass communication has been privatized, at least in the U.S., for a long time. Rather, Facebook’s insertion into the regulation of freedom of expression and its claim to support human rights is notable because these have traditionally been the territory of governments. While far from perfect, democracies provide citizens and other groups influence over the enforcement of human rights.
  • Facebook and other social media companies, however, have no such accountability to the public. Ensuring human rights needs to go beyond volunteerism by private companies. Perhaps with the Australia versus Facebook showdown, governments finally have an impetus to pay attention to the effects of technology companies on fundamental human rights.
Carsten Ullrich

Look closely at the motives of the Facebook boycotters | Financial Times - 0 views

  • "The Structural Transformation of the Public Sphere"
Carsten Ullrich

European regulation of video-sharing platforms: what's new, and will it work? | LSE Med... - 0 views

  • This set of rules creates a novel regulatory model
  • Again, leaving regulatory powers to a private entity without any public oversight is clearly not the right solution. But this is also not what, in my opinion, the new AVMSD does
  • But without transparency and information about individual cases, you surely can’t say whether the takedowns are really improving the media environment, or the providers are just trying to get rid of any controversial content – or, indeed, the content somebody just happens to be complaining about.
  • The regulator, on the other hand, has a more detached role when compared to older types of media regulation: it mainly assesses whether the mechanisms established by the provider comply with the law
  • This approach gives rise to concerns that we are just outsourcing regulation to private companies.
  • Indeed, the delegation of the exercise of regulatory powers to a private entity could be very damaging to freedom of speech and media.
  • So, I think the legal groundwork for protection but also the fair treatment of users is in the directive. Now it depends on the member states to implement it in such a way that this potential will be fulfilled (and the European Commission has a big role in this process).
Carsten Ullrich

XY v Facebook Ireland Ltd [2012] NIQB 96 (30 November 2012) - 0 views

  • [19] The Order of the Court will incorporate provision for liberty to apply. By this mechanism the Plaintiff, if necessary and if so advised, will be able to seek further relief from the Court if there is any recurrence of the offending publication. Of course, in such eventuality, it will be open to Facebook, acting responsibly and in accordance with the principles and themes clearly expressed in this judgment, to proactively take the necessary removal and closure steps.
  • [20] I refuse the Plaintiff's application for the wider form of interim injunction sought by him. This was to the effect that Facebook be required to monitor the offending webpage in order to prevent republication of the offensive material. In this respect, I prefer the argument of Mr Hopkins that such an order would lack the requisite precision, could impose a disproportionate burden and, further, would potentially require excessive supervision by the Court. See Cooperative Insurance v Argyll [1997] 3 All ER 297, pages 303 – 304, per Lord Hoffmann. See also Halsbury's Laws of England, Volume 24 (Fourth Edition Reissue), paragraph 849. The propriety of granting this discrete remedy will, of course, be revisited at the substantive trial, against the backcloth of a fuller evidential matrix, which should include details of how this social networking site actually operates from day to day.
Carsten Ullrich

CG v Facebook Ireland Ltd & Anor [2016] NICA 54 (21 December 2016) - 0 views

  • The commercial importance of ISS providers is recognised in Recital 2 of the Directive which notes the significant employment opportunities and stimulation of economic growth and investment in innovation from the development of electronic commerce. The purpose of the exemption from monitoring is to make the provision of the service practicable and to facilitate the opportunities for commercial activity. The quantities of information described by the learned trial judge at paragraph [19] of his judgment explain why such a provision is considered necessary. Although the 2002 Regulations do not contain a corresponding provision they need to be interpreted with the monitoring provision in mind.
  • Given the quantities of information generated, the legislative steer is that monitoring is not an option
  • The judge concluded that the existence of the XY litigation was itself sufficient to fix Facebook with actual knowledge of unlawful disclosure of information on Predators 2 or awareness of facts and circumstances from which it would have been apparent that the publication of the information constituted misuse of private information. In our view such a liability could only arise if Facebook was subject to a monitoring obligation
Carsten Ullrich

Facebook and the EU, or the failure of self-regulation | The Guest Blog - 0 views

  • How did we let this happen? Why do we appear so weak?
  • For years Brussels has been the champion of self-regulation. The dogma is – at least publicly – based on the assumption that companies know best how to tackle some of the challenges.
  • Our failure to understand the underlying challenges and a failure of regulation.
  • If it’s the latter, then we have to move away from self-regulation. We can’t continue defending self-regulation and fake outrage when what we already knew becomes public.
  • Some will shift all the blame to Facebook, but we are at least as responsible as they are. EU decision-makers let this happen with self-regulation and soft policy.
Carsten Ullrich

What Facebook isn't telling us about its fight against online abuse - Laura Bliss | Inf... - 0 views

  • In a six-month period from October 2017 to March 2018, 21m sexually explicit pictures, 3.5m graphically violent posts and 2.5m forms of hate speech were removed from its site. These figures help reveal some striking points.
  • As expected, the data indicates that the problem is getting worse.
    • Carsten Ullrich
      Problem is getting worse - use as argument; look at the Facebook report.
  • For instance, between January and March it was estimated that for every 10,000 messages online, between 22 and 27 contained graphic violence, up from 16 to 19 in the previous three months.
  • Here, the company has been proactive. Between January and March 2018, Facebook removed 1.9m messages encouraging terrorist propaganda, an increase of 800,000 comments compared to the previous three months. A total of 99.5% of these messages were located with the aid of advancing technology.
  • But Facebook hasn’t released figures showing how prevalent terrorist propaganda is on its site. So we really don’t know how successful the software is in this respect.
    • Carsten Ullrich
      We need data - this would be part of my demand for a standardized reporting system.
  • on self-regulation,
  • Facebook has also used technology to aid the removal of graphic violence from its site.
  • But we also know that Facebook’s figures also show that up to 27 out of every 10,000 comments that made it past the detection technology contained graphic violence.
  • One estimate suggests that 510,000 comments are posted every minute. If accurate, that would mean 1,982,880 violent comments are posted every 24 hours. [arithmetic checked in the sketch after this list]
  • Between the two three-month periods there was a 183% increase in the amount of posts removed that were labelled graphically violent. A total of 86% of these comments were flagged by a computer system.
  • This brings us to the other significant figure not included in the data released by Facebook: the total number of comments reported by users. As this is a fundamental mechanism in tackling online abuse, the amount of reports made to the company should be made publicly available
  • However, even Facebook still has a long way to go to get to total transparency. Ideally, all social networking sites would release annual reports on how they are tackling abuse online. This would enable regulators and the public to hold the firms more directly to account for failures to remove online abuse from their servers.
    • Carsten Ullrich
      My demand: standardized reporting.
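
A quick check of the arithmetic quoted above - a sketch in Python using the article's own figures (510,000 comments per minute; an upper bound of 27 violent comments per 10,000 slipping past detection):

    # Verify the article's estimate of violent comments per day.
    comments_per_minute = 510_000                      # the article's own estimate
    comments_per_day = comments_per_minute * 60 * 24   # 734,400,000 comments per day
    violent_upper = comments_per_day * 27 // 10_000    # 27 per 10,000 slipped past detection
    print(violent_upper)                               # 1982880 - matches the quoted 1,982,880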
Carsten Ullrich

The Web Is At A Crossroads - New Standard Enables Copyright Enforcement Violating Users... - 0 views

  • “Institutional standards should not contain elements pushed in by lobbies, since they are detrimental to public interests. Of course lobbies have financial and political means to ignore or distort standards in their products, but they want more.”
  • technical standards EME
Carsten Ullrich

CJEU in UPC Telekabel Wien: A totally legal court order...to do the impossible - Kluwer... - 0 views

  • Accordingly, UPC was instructed to do everything that could possibly and reasonably be expected of it to block kino.to. Whether all reasonable measures were taken was to be reviewed only in a subsequent “enforcement process”
  • The Court identified a three-way conflict between:  a) copyright and related rights; b) the intermediary’s right to conduct a business; and c) the freedom of information of internet users. It repeated its Promusicae conclusion that where several fundamental rights are at stake, a fair balance must be struck between the requirements of all. The Court found that the injunctive order under consideration struck the right balance.
  • intermediaries must be careful not to infringe users’ freedom of information
  • with regard to copyright protection, the Court stressed that a complete cessation of infringements might not be possible or achievable in practice
  • this does not pose a problem, given that, as previously emphasised in the Court’s case law, there is nothing whatsoever in Article 17(2) of the Charter to suggest that intellectual property is inviolable and must be absolutely protected
  • According to the Court, internet access providers must make sure that both right-holders and users are kept happy, with no real guidance as to what measures might achieve that effect.
  • “figuring out what content is legal against what content is infringing is too hard for us poor lawyers and judges!”
  • the two SABAM cases, which found filtering incompatible with fundamental rights, by confirming that specific (in the sense of “targeted at a clearly indicated website”) blocking injunctions are permissible, as long as they do not unreasonably infringe users’ rights.
  • act explicitly redirects the balancing exercise to a private enterprise and defers the assessment of its outcome to a later procedure.
  • The ISP has no real way of knowing what is and what is not “reasonable” in the eyes of the law.
  • It’ll be reasonable, the Court seems to say, as long as it’s not entirely ineffective, or at least tries to not be entirely ineffective, or at least suggests that users shouldn’t do this
  • Indeed, in a recent Dutch case, the court of appeal of The Hague overturned an injunction ordering access providers ZIGGO and XS4ALL to block the well-known torrenting site The Pirate Bay, after studies confirmed no effect at all on the number of downloads from illegal sources.
  • Insisting that a symbolic “do something” gesture must be made to establish that the intermediary is opposed to piracy, even if it cannot achieve real results.
  • UK’s Justice Arnold in EMI Records v British Sky Broadcasting
  • guidelines assessing the proportionality of blocking measures be laid down by the CJEU – that would have been welcome indeed!
  • UPC Telekabel Wien
Carsten Ullrich

EUR-Lex - 52003DC0702 - EN - EUR-Lex - 0 views

  • Article 15 prevents Member States from imposing on internet intermediaries, with respect to activities covered by Articles 12-14, a general obligation to monitor the information which they transmit or store or a general obligation to actively seek out facts or circumstances indicating illegal activities. This is important, as general monitoring of millions of sites and web pages would, in practical terms, be impossible and would result in disproportionate burdens on intermediaries and higher costs of access to basic services for users. [73] However, Article 15 does not prevent public authorities in the Member States from imposing a monitoring obligation in a specific, clearly defined individual case.[73] In this context, it is important to note that the reports and studies on the effectiveness of blocking and filtering applications appear to indicate that there is not yet any technology which could not be circumvented and provide full effectiveness in blocking or filtering illegal and harmful information whilst at the same time avoiding blocking entirely legal information resulting in violations of freedom of speech.
    • Carsten Ullrich
      Justifications mainly relate to economic viability and overblocking, but not surveillance.
  • justification for Article 15
Carsten Ullrich

CopyCamp Conference Discusses Fallacies Of EU Copyright Reform Amid Ideas For Copy Chan... - 0 views

  • Beyond the potential negative economic aspects, several speakers at the Copycamp conference rang the alarm bells over the potential fallout of round-the-clock obligatory monitoring and filtering of user content on the net. Diego Naranjo from the European Digital Rights initiative (EDRi) reported: “I heard one of the EU member state representatives say, ‘Why do we use this (filtering system) only for copyright?’,” he said. The idea of bringing down the unauthorised publication of copyrighted material by algorithm was “a very powerful tool in the hands of government,” he warned.
  • In contrast to the dark picture painted by many activists (on copyright, multi-purpose filtering machines and the end of ownership in the time of the internet of things), there are also chances for reform in various areas of rights protection.
  • EU copyright reform itself is a chance, argued Raegan MacDonald of the Mozilla Foundation, calling it “the opportunity of a generation to bring copyright in line with the digital age, and we want to do that.” Yet the task, as in earlier copyright legislative processes, is once more to expose what she described as the later-dismantled myths of big rights holders: that any attempt to harmonise exceptions would kill their industry.
Carsten Ullrich

American Internet, American Platforms, American Values - Centre for International Gover... - 0 views

  • Non-Americans should not be satisfied with this state of affairs, which basically amounts to Americans fighting with other Americans about how to run the world.
    • Carsten Ullrich
       
      !!!
  • that is, the idea that people should have a say in the rules that govern their activities. The Manila Principles, moreover, place an inordinate emphasis on domestic courts to regulate platforms, even though, as my co-author Keller notes, courts lack the expertise and policy-making capacity to do so.
  • What all of these proposals have in common, beyond adopting the American free-speech debate as their starting point, is that they treat these large platforms as an unalterable fact of life. They consider the main question to be not whether these platforms should be making decisions for billions of non-Americans, but how they should make these decisions.
  • The democratic right for non-Americans to determine the rules under which we should live is not even considered. Instead, attempts by democratic governments to impose legitimate democratic regulation on these companies, many of which have assumed the status of essential infrastructure, are derided as creeping authoritarianism or as a threat to the free and open internet.
  • At the very least, thinking of internet governance in these terms should make us more sympathetic to attempts by the Australian, Canadian, German and United Kingdom governments to legislate in this area, rather than be dismissive of the legitimacy of (democratic) governance on its face. If we value democratic oversight, state regulation is almost the only game in town, an approach that can be complemented with international treaty-making among democratic states so as to create agreed-upon minimum standards for regulating cross-border platform activities.
  • To address the first question, in a sense, the global American platforms are free riders on the notion that the internet as a network should be global in reach. Here, a useful analogy is the global financial system. Although we have a global financial system, it is characterized by domestic regulation and, in many countries
  • many of the social harms perpetuated by platforms are the likely result of their business models, which incentivize extremist speech and pervasive surveillance
  • Speech regulation without addressing these root causes is unlikely to be successful. If tools such as internet search functions truly have become essential to knowledge discovery and exhibit natural monopoly characteristics, countries should have the ability to determine for themselves what form they should take. To be blunt, public ownership should be on the table, even if it isn’t, currently, in the United States.
  • Google’s threat (which mirrored Facebook’s) to cut off its search service to Australia was likely due as much, if not more, to Australia’s plan to exercise oversight over its proprietary algorithm than it was about Australia’s plan to force Google to give a cut of its revenues to various Australian media outlets. The harshness of this threat highlights exactly how hard it will be for non-US countries to exert any meaningful control over the services currently monopolized by these US companies.
  • Already, the United States, as the home of these companies, is working to solidify the market and social dominance of its platforms.
  • As already mentioned, the CUSMA contains provisions protecting free cross-border data flows that, while justified in terms of encouraging trade, serve to preserve the dominance of the US platforms in Canada and Mexico. To this, we can add its successful inclusion of CDA Section 230 language in the agreement, effectively pre-empting Canadian and Mexican debates over what values we wish to apply to platform governance.
  • The first step to coming up with a sound policy involves understanding the policy terrain. In internet governance, and particularly in platform governance, this involves understanding the extent to which the dominant debates and landscape reflect particular US interests and values
  • These interests and values do not necessarily reflect those of people living in other countries. Both Canadians and Americans believe in free speech and market competition. However, our interpretations of the limits of each differ. This reality — the acknowledgement of legitimate differences and the necessity of democratic accountability — should be our starting point in discussions of internet governance, not the desire to preserve a global internet and platform ecosystem that is much less global, and much more American, than it appears.
Carsten Ullrich

Broad Consequences of a Systemic Duty of Care for Platforms - Daphne Keller [Updated] |... - 0 views

  • On the up-side, flexible standards would give platforms more leeway to figure out meaningful technical improvements, and perhaps arrive at more nuanced automated assessment of content over time
  • The down-sides of open-ended SDOC standards could be considerable, though. Proactive measures devised by platforms themselves would, even when coupled with transparency obligations, be far less subject to meaningful public review, accountability,