
Duty of care + Standards (CU): group items tagged "legislation"


Carsten Ullrich

Digital Services Act: Ensuring a trustworthy and safe online environment while allowing... - 0 views

  • The EU’s overall objectives are certainly well-intended. However, many concerns remain, for instance:
  • The DSA should tackle bad players and behaviours regardless of the platform’s size and country of origin. Having a specific regime for “very large online platforms” with additional obligations leaves the door open for rogue players to simply move to smaller digital service providers that are subject to a lighter regime.
  • To prevent legal uncertainty, the DSA should have a clear scope focusing on illegal content, products and services. The rules should be horizontal and principle-based, and could in a second phase be complemented with more targeted measures (legislative and non-legislative) to tackle specific concerns. 
  • Undermining the ‘country of origin’ principle would fragment the EU Single Market and create more red tape for national businesses trying to become European businesses.
  • While well-intended, EU policymakers should find the appropriate equilibrium between transparency, the protection against rogue players’ attempts to game the system, and the protection of operators’ trade secrets. Any new requirement must be achievable, proportionate to known risks and provide real added value.
Carsten Ullrich

Happy Birthday: The E-Commerce Directive Turns 20 - Disruptive Competition Project - 0 views

  • To be as effective as the ECD, the DSA should be a horizontal, principle-based legislative initiative, which could be complemented by targeted measures (legislative and non-legislative) tackling specific concerns. 
Carsten Ullrich

EUR-Lex - COM:2017:795:FIN - EN - EUR-Lex - 0 views

  • In e-commerce in particular, market surveillance authorities have great difficulty tracing non-compliant products imported into the Union and identifying the responsible entity within their jurisdiction.
  • In its 2017 work programme, the Commission announced an initiative to strengthen compliance with and enforcement of Union harmonisation legislation on products, as part of the 'Goods Package'. The initiative is to address the increasing amount of non-compliant products on the Union market while offering incentives to boost regulatory compliance and ensuring fair and equal treatment that will benefit businesses and citizens.
  • The development of e-commerce is also due to a great extent to the proliferation of information society service providers, normally through platforms and for remuneration, which offer intermediary services by storing third-party content, but without exercising any control over such content, thus not acting on behalf of an economic operator. Removal of content regarding non-compliant products or, where that is not feasible, blocking access to non-compliant products offered through their services should be without prejudice to the rules laid down in Directive 2000/31/EC of the European Parliament and of the Council. In particular, no general obligation should be imposed on service providers to monitor the information which they transmit or store, nor should a general obligation be imposed upon them to actively seek facts or circumstances indicating illegal activity. Furthermore, hosting service providers should not be held liable as long as they do not have actual knowledge of illegal activity or information and are not aware of the facts or circumstances from which the illegal activity or information is apparent.
  • Compliance rates by Member State/sectors and for e-commerce and imports (improvements in availability and quality of information in Member State enforcement strategies, progress in reduction of compliance gaps)
  • Those powers should be sufficiently robust to tackle the enforcement challenges of Union harmonisation legislation, along with the challenges of e-commerce and the digital environment and to prevent economic operators from exploiting gaps in the enforcement system by relocating to Member States whose market surveillance authorities are not equipped to tackle unlawful practices. In particular, the powers should ensure that information and evidence can be exchanged between competent authorities so that enforcement can be undertaken equally in all Member States.
  • (3) low deterrence of the current enforcement tools, notably with respect to imports from third countries and e-commerce
  • (4) important information gaps (i.e. lack of awareness of rules by businesses and little transparency as regards product compliance)
Carsten Ullrich

CG v Facebook Ireland Ltd & Anor [2016] NICA 54 (21 December 2016) - 0 views

  • The commercial importance of ISS providers is recognised in Recital 2 of the Directive which notes the significant employment opportunities and stimulation of economic growth and investment in innovation from the development of electronic commerce. The purpose of the exemption from monitoring is to make the provision of the service practicable and to facilitate the opportunities for commercial activity. The quantities of information described by the learned trial judge at paragraph [19] of his judgment explain why such a provision is considered necessary. Although the 2002 Regulations do not contain a corresponding provision they need to be interpreted with the monitoring provision in mind.
  • Given the quantities of information generated, the legislative steer is that monitoring is not an option.
  • The judge concluded that the existence of the XY litigation was itself sufficient to fix Facebook with actual knowledge of unlawful disclosure of information on Predators 2 or awareness of facts and circumstances from which it would have been apparent that the publication of the information constituted misuse of private information. In our view such a liability could only arise if Facebook was subject to a monitoring obligation.
Carsten Ullrich

Article - 0 views

  • On 6 February 2020, the audiovisual regulator of the French-speaking community of Belgium (Conseil supérieur de l’audiovisuel – CSA) published a guidance note on the fight against certain forms of illegal Internet content, in particular hate speech.
  • In the note, the CSA begins by summarising the current situation, highlighting the important role played by content-sharing platforms and their limited responsibility. It emphasises that some content can be harmful to young people in particular, whether they are the authors or victims of the content. It recognises that regulation, in its current form, is inappropriate and creates an imbalance between the regulation of online content-sharing platform operators, including social networks, and traditional players in the audiovisual sector.
  • It suggests that Belgium could take its own legislative measures without waiting for work to start on an EU directive on the subject. 
  • Content is considered clearly illegal if it advocates crimes against humanity; incites or advocates terrorist acts; or incites hatred, violence, discrimination or insults against a person or a group of people on grounds of origin, alleged race, religion, ethnic background, nationality, gender, sexual orientation, gender identity or disability, whether real or alleged.
  • obligations be imposed on the largest content-sharing platform operators, that is, any natural or legal person offering, on a professional basis, whether for remuneration or not, an online content-sharing platform, wherever it is based, used by at least 20% of the population of the French-speaking region of Belgium or the bilingual Brussels-Capital region.
  • Platforms would be obliged to remove or block content notified to them that is ‘clearly illegal’ within 24 hours.
  • need to put in place reporting procedures as well as processes for contesting their decisions
  • appoint an official contact person
  • half-yearly report on compliance with their obligation
Carsten Ullrich

Digital Economy Act 2010 - 0 views

  • Sections 17 and 18
Carsten Ullrich

Upload filters, copyright and magic pixie dust - Copybuzz - 0 views

  • At the heart of the initiative is a plan for online platforms to “increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.” Significantly, the ideas are presented as “guidelines and principles”. That’s because they are entirely voluntary. Except that the Commission makes it quite clear that if this totally voluntary system is not implemented by companies like Facebook and Google, it will bring in new laws to make them do it on a not-so-voluntary basis. The Commission is quite eager to see swift results from these voluntary efforts, as legislative proposals could already be on the table by May 2018.
  • But the worst idea, and one that appears multiple times in the latest plans, is the routine and pervasive use of upload filters.
  • In doing so, they have caused notable collateral damage, especially to fundamental rights.
  • The European Commission is well aware that Article 15 of the E-Commerce Directive explicitly prohibits Member States from imposing “a general obligation on providers … to monitor the information which they transmit or store, [or] a general obligation actively to seek facts or circumstances indicating illegal activity.”
  • does indeed involve a “general obligation” on those companies to filter all uploads for a vast range of “illegal content”
  • That lack of good faith makes the Commission’s stubborn insistence on a non-existent technical solution to a non-existent problem even more frustrating. If it had the courage to admit the truth about the unproblematic nature of unauthorised sharing of copyright materials, it wouldn’t need to come up with unhelpful approaches like upload filters that are certain to cause immense harm to both the online world and to the EU’s Digital Single Market.
Carsten Ullrich

CopyCamp Conference Discusses Fallacies Of EU Copyright Reform Amid Ideas For Copy Chan... - 0 views

  • Beyond the potential negative economic aspects, several speakers at the Copycamp conference rang the alarm bells over the potential fallout of round-the-clock obligatory monitoring and filtering of user content on the net. Diego Naranjo from the European Digital Rights initiative (EDRi) reported: “I heard one of the EU member state representatives say, ‘Why do we use this (filtering system) only for copyright?’,” he said. The idea of bringing down the unauthorised publication of copyrighted material by algorithm was “a very powerful tool in the hands of government,” he warned.
  • In contrast to the dark picture presented by many activists on copyright, multi-purpose filtering machines and the end of ownership in the time of the internet of things, chances for reform are presented for various areas of rights protection.
  • EU copyright reform itself is a chance, argued Raegan MacDonald from the Mozilla Foundation, calling it “the opportunity of a generation to bring copyright in line with the digital age, and we want to do that.” Yet the task, as in earlier copyright legislative processes, is once more to expose what she described as the later-dismantled myths of big rights holders: that any attempt to harmonise exceptions would kill their industry.
Carsten Ullrich

American Internet, American Platforms, American Values - Centre for International Gover... - 0 views

  • Non-Americans should not be satisfied with this state of affairs, which basically amounts to Americans fighting with other Americans about how to run the world.
    • Carsten Ullrich
       
      !!!
  • that is, the idea that people should have a say in the rules that govern their activities. The Manila Principles, moreover, place an inordinate emphasis on domestic courts to regulate platforms, even though, as my co-author Keller notes, courts lack the expertise and policy-making capacity to do so.
  • What all of these proposals have in common, beyond adopting the American free-speech debate as their starting point, is that they treat these large platforms as an unalterable fact of life. They consider the main question to be not whether these platforms should be making decisions for billions of non-Americans, but how they should make these decisions.
  • The democratic right for non-Americans to determine the rules under which we should live is not even considered. Instead, attempts by democratic governments to impose legitimate democratic regulation on these companies, many of which have assumed the status of essential infrastructure, are derided as creeping authoritarianism or as a threat to the free and open internet.
  • At the very least, thinking of internet governance in these terms should make us more sympathetic to attempts by the Australian, Canadian, German and United Kingdom governments to legislate in this area, rather than be dismissive of the legitimacy of (democratic) governance on its face. If we value democratic oversight, state regulation is almost the only game in town, an approach that can be complemented with international treaty-making among democratic states so as to create agreed-upon minimum standards for regulating cross-border platform activities.
  • To address the first question, in a sense, the global American platforms are free riders on the notion that the internet as a network should be global in reach. Here, a useful analogy is the global financial system. Although we have a global financial system, it is characterized by domestic regulation and, in many countries
  • many of the social harms perpetuated by platforms are the likely result of their business models, which incentivize extremist speech and pervasive surveillance
  • Speech regulation without addressing these root causes is unlikely to be successful. If tools such as internet search functions truly have become essential to knowledge discovery and exhibit natural monopoly characteristics, countries should have the ability to determine for themselves what form they should take. To be blunt, public ownership should be on the table, even if it isn’t, currently, in the United States.
  • Google’s threat (which mirrored Facebook’s) to cut off its search service to Australia was likely due as much, if not more, to Australia’s plan to exercise oversight over its proprietary algorithm as to Australia’s plan to force Google to give a cut of its revenues to various Australian media outlets. The harshness of this threat highlights exactly how hard it will be for non-US countries to exert any meaningful control over the services currently monopolized by these US companies.
  • Already, the United States, as the home of these companies, is working to solidify the market and social dominance of its platforms.
  • As already mentioned, the CUSMA contains provisions protecting free cross-border data flows that, while justified in terms of encouraging trade, serve to preserve the dominance of the US platforms in Canada and Mexico. To this, we can add its successful inclusion of CDA Section 230 language in the agreement, effectively pre-empting Canadian and Mexican debates over what values we wish to apply to platform governance.
  • The first step to coming up with a sound policy involves understanding the policy terrain. In internet governance, and particularly in platform governance, this involves understanding the extent to which the dominant debates and landscape reflect particular US interests and values.
  • These interests and values do not necessarily reflect those of people living in other countries. Both Canadians and Americans believe in free speech and market competition. However, our interpretations of the limits of each differ. This reality — the acknowledgement of legitimate differences and the necessity of democratic accountability — should be our starting point in discussions of internet governance, not the desire to preserve a global internet and platform ecosystem that is much less global, and much more American, than it appears.
Carsten Ullrich

Online Harms: Government publishes response to consultation, Ofcom to be given powers t... - 0 views

  • A small group of companies with the largest online presence and high-risk features, which is likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1, while Category 2 services include platforms that host dating services or pornography and private messaging apps. The Government has said that less than 3% of UK businesses will fall within the scope of the legislation and that the vast majority of companies that do will be Category 2 services.
Carsten Ullrich

Systemic Duties of Care and Intermediary Liability - Daphne Keller | Inforrm's Blog - 0 views

  • Pursuing two reasonable-sounding goals for platform regulation
  • First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown based legal model
  • Second, they want to preserve existing immunities
  • A “systemic duty of care” is a legal standard for assessing a platform’s overall system for handling harmful online content. It is not intended to define liability for any particular piece of content, or the outcome of particular litigation disputes.
  • The basic idea is that platforms should improve their systems for reducing online harms. This could mean following generally applicable rules established in legislation, regulations, or formal guidelines; or it could mean working with the regulator to produce and implement a platform-specific plan.
  • In one sense I have a lot of sympathy for this approach
  • In another sense, I am quite leery of the duty of care idea.
  • The actions platforms might take to comply with an SDOC generally fall into two categories. The first encompasses improvements to existing notice-and-takedown systems.
  • The second SDOC category – which is in many ways more consequential – includes obligations for platforms to proactively detect and remove or demote such content.
  • Proactive Monitoring Measures
    • Carsten Ullrich
       
      this is a bit too narrow; proactivity really means a risk-based approach, not just monitoring, but monitoring for threats and risks
  • The eCommerce Directive and DMCA both permit certain injunctions, even against intermediaries that are otherwise immune from damages. Here again, the platform’s existing capabilities – its capacity to know about and control user content – matter. In the U.K. Mosley v. Google case, for example, the claimant successfully argued that because Google already used technical filters to block illegal child sexual abuse material, it could potentially be compelled to filter the additional images at issue in his case.