Duty of care + Standards _ CU / Group items tagged: rights

Carsten Ullrich

Facebook is stepping in where governments won't on free expression - Wendy H. Wong and ...

  • The explicit reference to human rights in its charter acknowledges that companies have a role in protecting and enforcing human rights.
  • This is consistent with efforts by the United Nations and other advocacy efforts to create standards on how businesses should be held accountable for human rights abuses. In light of Facebook’s entanglement in misinformation, scandals and election falsehoods, as well as genocide and incitement of violence, it seems particularly pertinent for the company.
  • To date, we have assigned such decision-making powers to states, many of which are accountable to their citizens. Facebook, on the other hand, is unaccountable to citizens in nations around the world, and a single individual (Mark Zuckerberg) holds majority decision-making power at the company.
  • ...6 more annotations...
  • In other cases, human moderators have had their decisions overturned. The Oversight Board also upheld Facebook’s decision to remove a dehumanizing ethnic slur against Azerbaijanis in the context of an active conflict over the Nagorno-Karabakh disputed region.
  • But Facebook and other social media companies do not have to engage in a transparent, publicly accountable process to make their decisions. Yet Facebook claims that in its decision-making it upholds the human right of freedom of expression. However, freedom of expression does not mean the same thing to everyone.
  • Private organizations are currently the only consistent governors of data and social media.
  • However, the Oversight Board deals with only a small fraction of possible cases.
  • Facebook’s dominance in social media, however, is notable not because it’s a private company. Mass communication has been privatized, at least in the U.S., for a long time. Rather, Facebook’s insertion into the regulation of freedom of expression and its claim to support human rights is notable because these have traditionally been the territory of governments. While far from perfect, democracies provide citizens and other groups influence over the enforcement of human rights.
  • Facebook and other social media companies, however, have no such accountability to the public. Ensuring human rights needs to go beyond volunteerism by private companies. Perhaps with the Australia versus Facebook showdown, governments finally have an impetus to pay attention to the effects of technology companies on fundamental human rights.
Carsten Ullrich

Algorithm Transparency: How to Eat the Cake and Have It Too - European Law Blog

  • While AI tools still exist in a relative legal vacuum, this blog post explores: 1) the extent of protection granted to algorithms as trade secrets with exceptions of overriding public interest; 2) how the new generation of regulations on the EU and national levels attempt to provide algorithm transparency while preserving trade secrecy; and 3) why the latter development is not a futile endeavour. 
  • The most complex algorithms dominating our lives (including those developed by Google and Facebook) are proprietary, i.e. shielded as trade secrets, while only a negligible minority of algorithms are open source.
  • Article 2 of the EU Trade Secrets Directive
  • ...11 more annotations...
  • However, the protection granted by the Directive is not absolute. Article 1(2)(b), bolstered by Recital 11, concedes that secrecy will take a back seat if the ‘Union or national rules require trade secret holders to disclose, for reasons of public interest, information, including trade secrets, to the public or to administrative or judicial authorities for the performance of the duties of those authorities’. 
  • With regard to trade secrets in general, in the Microsoft case, the CJEU held that a refusal by Microsoft to share interoperability information with a competitor constituted a breach of Article 102 TFEU.
  • Although trade secrets remained protected from the public and competitors, Google had to disclose PageRank parameters to the Commission as the administrative authority for the performance of its investigative duties. It is possible that a similar examination will take place in the recently launched probe into Amazon’s treatment of third-party sellers.
  • For instance, in February 2020, the District Court of The Hague held that the System Risk Indication algorithm that the Dutch government used to detect fraud in areas such as benefits, allowances, and taxes violated the right to privacy (Article 8 ECHR), inter alia, because it was not transparent enough: the government had neither publicized the risk model and the indicators that make up the risk model, nor submitted them to the Court (para 6.49).
  • Article 22 remains one of the most unenforceable provisions of the GDPR. Some scholars (see, e.g., Wachter) question the existence of such a right to explanation altogether, claiming that if the right does not withstand the balancing against trade secrets, it is of little value.
  • In 2019, to ensure competition in the platform economy, the European Parliament and the Council adopted the Platform-to-Business (P2B) Regulation. To create a level playing field between businesses, the Regulation for the first time requires platforms to disclose to businesses the main parameters of the ranking systems they employ, i.e. ‘algorithmic sequencing, rating or review mechanisms, visual highlights, or other saliency tools’, while recognising the protection of algorithms by the Trade Secrets Directive (Article 1(5)).
  • The recent Guidelines on ranking transparency by the European Commission interpret the ‘main parameters’ to mean ‘what drove the design of the algorithm in the first place’ (para 41).
  • The German Interstate Media Law that entered into force in October 2020, transposes the revised Audio-Visual Services Directive, but also goes well beyond the Directive in tackling automated decision-making that leads to prioritization and recommendation of content.
  • This obligation to ‘explain the algorithm’ makes it the first national law that, in ensuring fairness for all journalistic and editorial offers, also aims more generally at diversity of opinion and information in the digital space – a distinct human rights dimension. If the provision proves enforceable, it might serve as an example for other Member States to emulate. 
  • Lastly, the draft DSA grants the newly introduced Digital Services Coordinators, the Commission, as well as vetted researchers (under conditions to be specified) powers of data access to ensure compliance with the DSA. The core of this right, however, is undermined by Article 31(6), which effectively allows the platforms to refuse such access based on trade secrecy concerns.
  • This shows that although addressing algorithms in a horizontal instrument is a move in the right direction, to make it enforceable, the final DSA, as well as any ensuing guidelines, should differentiate between three tiers of disclosure: 1) full disclosure – granting supervisory bodies the right of access, which may not be refused by the IP owners, to all confidential information; 2) limited disclosure – granting vetted researchers the right of access limited in time and scope, with legal guarantees for protection of trade secrecy; and 3) explanation of main parameters – granting individuals information in accessible language without prejudice to trade secrets. 
Carsten Ullrich

CJEU in UPC Telekabel Wien: A totally legal court order...to do the impossible - Kluwer...

  • Accordingly, UPC was instructed to do everything that could possibly and reasonably be expected of it to block kino.to. Whether all reasonable measures were taken was to be reviewed only in a subsequent “enforcement process”
  • The Court identified a three-way conflict between: a) copyright and related rights; b) the intermediary’s right to conduct a business; and c) the freedom of information of internet users. It repeated its Promusicae conclusion that where several fundamental rights are at stake, a fair balance must be struck between the requirements of all. The Court found that the injunctive order under consideration struck the right balance.
  • intermediaries must be careful not to infringe users’ freedom of information
  • ...12 more annotations...
  • with regard to copyright protection, the Court stressed that a complete cessation of infringements might not be possible or achievable in practice
  • this does not pose a problem, given that, as previously emphasised in the Court’s case law, there is nothing whatsoever in Article 17(2) of the Charter to suggest that intellectual property is inviolable and must be absolutely protected
  • According to the Court, internet access providers must make sure that both right-holders and users are kept happy, with no real guidance as to what measures might achieve that effect.
  • “figuring out what content is legal against what content is infringing is too hard for us poor lawyers and judges!”
  • the two SABAM cases, which found filtering incompatible with fundamental rights, by confirming that specific (in the sense of “targeted at a clearly indicated website”) blocking injunctions are permissible, as long as they do not unreasonably infringe users’ rights.
  • The order in fact explicitly redirects the balancing exercise to a private enterprise and defers the assessment of its outcome to a later procedure.
  • The ISP has no real way of knowing what is and what is not “reasonable” in the eyes of the law.
  • It’ll be reasonable, the Court seems to say, as long as it’s not entirely ineffective, or at least tries to not be entirely ineffective, or at least suggests that users shouldn’t do this.
  • Indeed, in a recent Dutch case, the court of appeal of The Hague overturned an injunction ordering access providers ZIGGO and XS4ALL to block the well-known torrenting site The Pirate Bay, after studies confirmed no effect at all on the number of downloads from illegal sources.
  • Insisting that a symbolic “do something” gesture must be made to establish that the intermediary is opposed to piracy, even if it cannot achieve real results.
  • UK’s Justice Arnold in EMI Records v British Sky Broadcasting
  • Had guidelines assessing the proportionality of blocking measures been laid down by the CJEU – that would have been welcome indeed!
  • UPC Telekabel Wien
Carsten Ullrich

The IPKat: France: costs of blocking injunctions to be borne by internet intermediaries

  • Why? Because (a) everybody has to chip in to the fight against piracy - that includes ISPs and IBPs - and (b) ISPs and IBPs profit from letting users access infringing sites and can afford to cover such costs, whereas right holders may not. As such, bearing the full costs of injunctions is no 'unbearable sacrifice' within the meaning of the CJEU's Telekabel jurisprudence.
  • The unions had asked the ISP/IBPs to block and de-list four websites providing access to protected material via streaming and/or downloading: www.allostreaming.com, www.allowshowtv.com, www.allomovies.com and www.alloshare.com.
  • The claimants also applied for the costs of the injunctions to be covered by ISP/IBPs in their entirety because they were not in the position to sustain these measures financially.
  • ...9 more annotations...
  • The Appeal Court based its decision on the fact that right holders' unions and societies were financially unable to cover the costs of injunctions, whilst ISP/IBPs were.
  • The appeal decision went further by stressing that the order was also justified by the fact that the defendants generated profits from internet users accessing the infringing websites. As a result, the Court found no breach of ISP/IBPs' freedom to conduct business (as protected by Articles 16 and 52(2) of the Charter of Fundamental Rights of the European Union).
  • Nevertheless, the Supreme Court insisted that the judiciary had jurisdiction to require ISP/IBPs to take any necessary measures against copyright infringement on the internet, thanks to the 2000 Directive on electronic commerce and the 2001 InfoSoc Directive (transposed into national law as Article 6-1-8 of the 2004 'LCEN' Act). The Court held that these provisions provided a lawful basis for charging the costs of injunctions to ISP/IBPs, because as "technical intermediaries" ISP/IBPs are "best placed to bring such infringing activities to an end", quoting the words of the InfoSoc Directive (Recital 59) directly.
  • First, it confirmed that neither ISPs nor IBPs were liable for secondary infringement so long as they had no knowledge of the infringing activities or acted sufficiently promptly to put an end to known illegal acts upon notification by right holders. Second, the Supreme Court reasserted that ISP/IBPs were under no statutory obligation to undertake surveillance of internet users.
  • The Supreme Court judges see nothing under EU law that would prevent national courts from attributing all costs to intermediaries.
  • "despite their non-liability, access and hosting providers are legally bound to contribute to the fight against illicit material and, more specifically, against the infringement of authors' and neighboring rights" ; "...[O]n the basis of the pure point of law, the decision of the Court of Appeal was legally justified". 
  • On the other hand, neither ISPs nor IBPs demonstrated that the performance of the measures would represent an unbearable sacrifice, or that their costs would endanger their economic viability.
  • It is very interesting to see French Courts give so much weight to the financial situation of the parties and the (alleged or potential) revenues generated by ISP/IBPs from infringing websites, in their application of liability rules. Indeed, the latter are usually framed as pure questions of law, disconnected from economic realities.
  • We will have to wait to see whether the position of the French court catches on in other jurisdictions, or not.
Carsten Ullrich

HUDOC - European Court of Human Rights

  • Thus, the Court considers that the applicant company was in a position to assess the risks related to its activities and that it must have been able to foresee, to a reasonable degree, the consequences which these could entail. It therefore concludes that the interference in issue was “prescribed by law” within the meaning of the second paragraph of Article 10 of the Convention.
  • The Court has found that persons carrying on a professional activity, who are used to having to proceed with a high degree of caution when pursuing their occupation, can on this account be expected to take special care in assessing the risks that such activity entails
  • Thus, the Court notes that the applicant company cannot be said to have wholly neglected its duty to avoid causing harm to third parties. Nevertheless, and more importantly, the automatic word-based filter used by the applicant company failed to filter out odious hate speech and speech inciting violence posted by readers and thus limited its ability to expeditiously remove the offending comments
  • ...2 more annotations...
  • Against that background, the Chamber considered that the applicant company had been in a position to assess the risks related to its activities and that it must have been able to foresee, to a reasonable degree, the consequences which these could entail.
  • Lastly, the Court observes that the applicant company has argued (see paragraph 78 above) that the Court should have due regard to the notice-and-take-down system that it had introduced. If accompanied by effective procedures allowing for rapid response, this system can in the Court’s view function in many cases as an appropriate tool for balancing the rights and interests of all those involved. However, in cases such as the present one, where third-party user comments are in the form of hate speech and direct threats to the physical integrity of individuals, as understood in the Court’s case-law (see paragraph 136 above), the Court considers, as stated above (see paragraph 153), that the rights and interests of others and of society as a whole may entitle Contracting States to impose liability on Internet news portals, without contravening Article 10 of the Convention, if they fail to take measures to remove clearly unlawful comments without delay, even without notice from the alleged victim or from third parties.
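
A minimal sketch, in Python, of the automatic word-based filtering discussed in the annotations above, and of why the Court found it wanting. The blocklist entries, example comments, and function name are all invented for illustration; real portal filters used far larger lists but shared the same structural weakness of matching only literal tokens.

```python
# Hypothetical sketch of a naive word-based comment filter of the kind the
# judgment found inadequate. Blocklist and comments are invented examples.
BLOCKED_WORDS = {"vermin", "scum"}  # assumed blocklist entries

def passes_filter(comment: str) -> bool:
    """Return True if the comment contains no blocked word (case-insensitive)."""
    tokens = comment.lower().split()
    return not any(token.strip(".,!?") in BLOCKED_WORDS for token in tokens)

# Exact matches are caught...
print(passes_filter("These people are vermin"))          # False: blocked
# ...but trivial obfuscation and keyword-free threats pass straight through,
# which is why word lists alone could not catch "odious hate speech".
print(passes_filter("These people are v3rmin"))          # True: slips through
print(passes_filter("Someone should pay them a visit"))  # True: no keyword
```

Such a filter only under-blocks more as commenters adapt, which is the gap the Court said remained open for clearly unlawful comments even alongside a notice-and-take-down system.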
Carsten Ullrich

Council of Europe - ETS No. 185 - Convention on Cybercrime

  • Recognising the need for co-operation between States and private industry
  • need to protect legitimate interests
  • proper balance between the interests of law enforcement and respect for fundamental human rights
  • ...11 more annotations...
  • right to freedom of expression, including the freedom to seek, receive, and impart information and ideas of all kinds, regardless of frontiers, and the rights concerning the respect for privacy;
  • United Nations, the OECD
  • European Union and the G8
  • establish as criminal offences under its domestic law,
  • producing child pornography
  •   offering or making available child pornography
  • distributing or transmitting
  • procuring
  • possessing
  • expeditious preservation of traffic data is available
  • expeditious disclosure to the Party’s competent authority,
Carsten Ullrich

Euro Security Experts Deem 'Right to be Forgotten' Impossible | Center for Democracy & ...

  • right to be forgotten
Carsten Ullrich

IRIS Newsletter

    • Carsten Ullrich
       
      ask Cedric for background and how it works, especially the algorithmic transparency
  • On 19 September, Google and the Association to Combat Audiovisual Piracy (Association de Lutte contre la Piraterie Audiovisuelle - “ALPA”) signed a partnership agreement aimed at effectively reinforcing copyright protection for the on-line exploitation of audiovisual works.
  • under the auspices of the National Centre for Cinema (Centre National du Cinéma - “the CNC”)
  • ...3 more annotations...
  • Google’s video platform, YouTube, will make its Content ID algorithm available to ALPA.
  • The algorithm is a tool for identifying and managing rights; ALPA will be able to apply the “block” and “follow” rules directly for any work placed on-line without the authorisation of the respective rights-holders. In this way rights-holders can add their works to the Content ID filter and ensure that their films and productions are not placed on YouTube without their consent. Google also undertakes to prevent pirate streaming and downloading sites from fraudulently buying keywords through its AdWords service. It also undertakes to provide ALPA with financial support; the agreement testifies to Google’s determination to contribute to the fight against piracy and to strengthen its policy of cooperation with originators and rights-holders.
  • The President of ALPA, Nicolas Seydoux, welcomed the agreement, which he said symbolised “the collapse of a wall of incomprehension” between Google and ALPA
  • check with Cedric on background
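
The “block” and “follow” rules described above can be made concrete with a small sketch of a Content ID-style pipeline: rights-holders register reference fingerprints together with a per-work policy, and each upload is checked against the registry. Everything here is a simplifying assumption: exact SHA-256 hashing stands in for YouTube's proprietary perceptual audio/video fingerprinting, and all function names are invented.

```python
# Minimal sketch of a Content ID-style registry (invented names throughout).
import hashlib

registry: dict[str, str] = {}  # fingerprint -> policy: "block" or "follow"

def fingerprint(media_bytes: bytes) -> str:
    # Stand-in for perceptual fingerprinting: exact hashing only matches
    # byte-identical files, whereas real systems also match re-encodes/clips.
    return hashlib.sha256(media_bytes).hexdigest()

def register_work(media_bytes: bytes, policy: str) -> None:
    """A rights-holder (e.g., an ALPA member) registers a work and its rule."""
    registry[fingerprint(media_bytes)] = policy

def check_upload(media_bytes: bytes) -> str:
    """Return the action for an upload: 'block', 'follow', or 'allow'."""
    return registry.get(fingerprint(media_bytes), "allow")

register_work(b"<reference copy of a film>", "block")
print(check_upload(b"<reference copy of a film>"))  # 'block'
print(check_upload(b"<unrelated home video>"))      # 'allow'
```

The per-work policy is the design point the agreement turns on: via ALPA, rights-holders themselves choose between removal (“block”) and monitoring (“follow”), rather than the platform making that call.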
Carsten Ullrich

CopyCamp Conference Discusses Fallacies Of EU Copyright Reform Amid Ideas For Copy Chan...

  • Beyond the potential negative economic aspects, several speakers at the CopyCamp conference rang alarm bells over the potential fallout of round-the-clock obligatory monitoring and filtering of user content on the net. Diego Naranjo from the European Digital Rights initiative (EDRi) reported: “I heard one of the EU member state representatives say, ‘Why do we use this (filtering system) only for copyright?’” The idea of bringing down the unauthorised publication of copyrighted material by algorithm was “a very powerful tool in the hands of government,” he warned.
  • In contrast to the dark picture presented by many activists (on copyright, multi-purpose filtering machines, and the end of ownership in the age of the internet of things), speakers also presented chances for reform in various areas of rights protection.
  • EU copyright reform itself is a chance, argued Raegan MacDonald from the Mozilla Foundation, calling it “the opportunity of a generation to bring copyright in line with the digital age, and we want to do that.” Yet the task, as in earlier copyright legislative processes, is once more to expose what she described as the later-dismantled myths of big rights holders: that any attempt to harmonise exceptions would kill their industry.
Carsten Ullrich

The Web Is At A Crossroads - New Standard Enables Copyright Enforcement Violating Users...

  • “Institutional standards should not contain elements pushed in by lobbies, since they are detrimental to public interests. Of course lobbies have financial and political means to ignore or distort standards in their products, but they want more.”
  • technical standards EME
Carsten Ullrich

European regulation of video-sharing platforms: what's new, and will it work? | LSE Med...

  • This set of rules creates a novel regulatory model
  • Again, leaving regulatory powers to a private entity without any public oversight is clearly not the right solution. But this is also not what, in my opinion, the new AVMSD does
  • But without transparency and information about individual cases, you surely can’t say whether the takedowns are really improving the media environment, or the providers are just trying to get rid of any controversial content – or, indeed, the content somebody just happens to be complaining about.
  • ...4 more annotations...
  • The regulator, on the other hand, has a more detached role compared to older types of media regulation: it mainly assesses whether the mechanisms established by the provider comply with the law.
  • This approach gives rise to concerns that we are just outsourcing regulation to private companies.
  • Indeed, the delegation of the exercise of regulatory powers to a private entity could be very damaging to freedom of speech and media.
  • So, I think the legal groundwork for protection but also the fair treatment of users is in the directive. Now it depends on the member states to implement it in such a way that this potential will be fulfilled (and the European Commission has a big role in this process).
Carsten Ullrich

United Kingdom | OpenNet Initiative

  • The U.K., together with the United States, was ranked as one of the worst offenders against individual privacy rights in the democratic world by Privacy International for 2007.
  • Moreover, certain filtering and tracking practices do take place.
  • The U.K. government, however, has to ensure that blocking practices do not lead to abuse in the absence of external and independent control.
Carsten Ullrich

Problems with Filters in the European Commission's Platforms Proposal - Daphne Keller |...

  • They are shockingly expensive – YouTube’s Content ID had cost Google $60 million as of several years ago – so only incumbents can afford them. Start-ups forced to build them won’t be able to afford it, or will build lousy ones with high error rates. Filters address symptoms and leave underlying problems to fester – like, in the case of radical Islamist material, the brutal conflict in Syria, the global refugee crisis, and the marginalization of Muslim immigrants to the US and Europe. All these problems make filters incredibly hard to justify without some great demonstrated upside – but no one has demonstrated such a thing.
  • The DMCA moves literally billions of disputes about online speech out of courts and into the hands of private parties.
  • That allocative choice was reasonable in 1998, and it remains reasonable in 2016.
    • Carsten Ullrich
       
      I don't think so.
  • ...1 more annotation...
  • The Internet has grown exponentially in size since the DMCA was enacted, but we should not forget that the problem of large-scale infringement was an expected development—and one that the safe harbors were specifically designed to manage.
    • Carsten Ullrich
       
      any proof for that assertion?
Carsten Ullrich

Upload filters, copyright and magic pixie dust - Copybuzz

  • At the heart of the initiative is a plan for online platforms to “increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.” Significantly, the ideas are presented as “guidelines and principles”. That’s because they are entirely voluntary. Except that the Commission makes it quite clear that if this totally voluntary system is not implemented by companies like Facebook and Google, it will bring in new laws to make them do it on a not-so-voluntary basis. The Commission is quite eager to see swift results from these voluntary efforts, as legislative proposals could already be on the table by May 2018.
  • But the worst idea, and one that appears multiple times in the latest plans, is the routine and pervasive use of upload filters.
  • In doing so, they have caused notable collateral damage, especially to fundamental rights.
  • ...3 more annotations...
  • The European Commission is well aware that Article 15 of the E-Commerce Directive explicitly prohibits Member States from imposing “a general obligation on providers … to monitor the information which they transmit or store, [or] a general obligation actively to seek facts or circumstances indicating illegal activity.
  • does indeed involve a “general obligation” on those companies to filter all uploads for a vast range of “illegal content”
  • That lack of good faith makes the Commission’s stubborn insistence on a non-existent technical solution to a non-existent problem even more frustrating. If it had the courage to admit the truth about the unproblematic nature of unauthorised sharing of copyright materials, it wouldn’t need to come up with unhelpful approaches like upload filters that are certain to cause immense harm to both the online world and to the EU’s Digital Single Market.
Carsten Ullrich

The Next Wave of Platform Governance - Centre for International Governance Innovation

  • The shift from product- and service-based to platform-based business creates a new set of platform governance implications — especially when these businesses rely upon shared infrastructure from a small, powerful group of technology providers (Figure 1).
  • The industries in which AI is deployed, and the primary use cases it serves, will naturally determine the types and degrees of risk, from health and physical safety to discrimination and human-rights violations. Just as disinformation and hate speech are known risks of social media platforms, fatal accidents are a known risk of automobiles and heavy machinery, whether they are operated by people or by machines. Bias and discrimination are potential risks of any automated system, but they are amplified and pronounced in technologies that learn, whether autonomously or by training, from existing data.
  • Business Model-Specific Implications
  • ...7 more annotations...
  • The implications of cloud platforms such as Salesforce, Microsoft, Apple, Amazon and others differ again. A business built on a technology platform with a track record of well-developed data and model governance, audit capability, responsible product development practices and a culture and track record of transparency will likely reduce some risks related to biased data and model transparency, while encouraging (and even enforcing) adoption of those same practices and norms throughout its ecosystem.
  • policies that govern their internal practices for responsible technology development; guidance, tools and educational resources for their customers’ responsible use of their technologies; and policies (enforced in terms of service) that govern the acceptable use of not only their platforms but also specific technologies, such as face recognition or gait detection.
  • At the same time, overreliance on a small, well-funded, global group of technology vendors to set the agenda for responsible and ethical use of AI may create a novel set of risks.
  • Audit is another area that, while promising, is also fraught with potential conflict. Companies such as O’Neil Risk Consulting and Algorithmic Auditing, founded by the author of Weapons of Math Destruction, Cathy O’Neil, provide algorithmic audit and other services intended to help companies better understand and remediate data and model issues related to discriminatory outcomes. Unlike, for example, audits of financial statements, algorithmic audit services are as yet entirely voluntary, lack oversight by any type of governing board, and do not carry disclosure requirements or penalties. As a result, no matter how thorough the analysis or comprehensive the results, these types of services are vulnerable to manipulation or exploitation by their customers for “ethics-washing” purposes.
  • We must broaden our understanding of platforms beyond social media sites to other types of business platforms, examine those risks in context, and approach governance in a way that accounts not only for the technologies themselves, but also for the disparate impacts among industries and business models.
  • This is a time-sensitive issue
  • Large technology companies — for a range of reasons — are trying to fill the policy void, creating the potential for a kind of demilitarized zone for AI, one in which neither established laws nor corporate policy hold sway.