
Duty of care + Standards _ CU
Carsten Ullrich

Broad Consequences of a Systemic Duty of Care for Platforms - Daphne Keller [Updated] |... - 0 views

  • On the up-side, flexible standards would give platforms more leeway to figure out meaningful technical improvements, and perhaps arrive at more nuanced automated assessment of content over time.
  • The down-sides of open-ended SDOC standards could be considerable, though. Proactive measures devised by platforms themselves would, even when coupled with transparency obligations, be far less subject to meaningful public review and accountability.
Carsten Ullrich

Systemic Duties of Care and Intermediary Liability - Daphne Keller | Inforrm's Blog - 0 views

  • Pursuing two reasonable-sounding goals for platform regulation
  • First, they want platforms to abide by a “duty of care,” going beyond today’s notice-and-takedown based legal model
  • Second, they want to preserve existing immunities
  • “Systemic duty of care” is a legal standard for assessing a platform’s overall system for handling harmful online content. It is not intended to define liability for any particular piece of content, or the outcome of particular litigation disputes.
  • The basic idea is that platforms should improve their systems for reducing online harms. This could mean following generally applicable rules established in legislation, regulations, or formal guidelines; or it could mean working with the regulator to produce and implement a platform-specific plan.
  • In one sense I have a lot of sympathy for this approach
  • In another sense, I am quite leery of the duty of care idea.
  • The actions platforms might take to comply with an SDOC generally fall into two categories. The first encompasses improvements to existing notice-and-takedown systems.
  • The second SDOC category – which is in many ways more consequential – includes obligations for platforms to proactively detect and remove or demote such content.
  • Proactive Monitoring Measures
    • Carsten Ullrich
       
      this is a bit too narrow; proactivity really means a risk-based approach, not just monitoring, but monitoring for threats and risks
  • The eCommerce Directive and DMCA both permit certain injunctions, even against intermediaries that are otherwise immune from damages. Here again, the platform’s existing capabilities – its capacity to know about and control user content – matter. In the U.K. Mosley v. Google case, for example, the claimant successfully argued that because Google already used technical filters to block illegal child sexual abuse material, it could potentially be compelled to filter the additional images at issue in his case.
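The Mosley v. Google argument turns on the fact that filtering known images is technically feasible once a blocklist of their digests exists. Below is a minimal sketch of that idea in Python, assuming a hypothetical BLOCKLIST of known-bad digests; production systems such as PhotoDNA rely on perceptual hashing that survives re-encoding and resizing, not the exact cryptographic match shown here.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hex digests of known illegal images.
# A cryptographic hash only catches exact byte-for-byte copies; real
# deployments use perceptual hashes to catch resized or re-encoded variants.
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches a known-bad digest."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# Usage: reject the upload before it is stored or published.
if should_block(b"example image bytes"):
    print("upload rejected: matches blocklist")
```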
Carsten Ullrich

Happy Birthday: The E-Commerce Directive Turns 20 - Disruptive Competition Project - 0 views

  • To be as effective as the ECD, the DSA should be a horizontal principle-based legislative initiative, which could be complemented by targeted measures (legislative and non-legislative) tackling specific concerns.
Carsten Ullrich

A New Blueprint for Platform Governance | Centre for International Governance Innovation - 0 views

  • We often talk about the “online environment.” This metaphorical language makes it seem like the online space looks similar to our offline world. For example, the term “information pollution,” coined by Claire Wardle, is increasingly being used to discuss disinformation online.  
  • It is even harder to prove direct connections between online platforms and offline harms. This is partly because platforms are not transparent.
  • Finally, this analogy reminds us that both problems are dispiritingly hard to solve. Two scholars, Whitney Phillips and Ryan Milner, have suggested that our online information problems are ecosystemic, similar to the climate crisis.
  • As Phillips argues, “we’re not going to solve the climate crisis if people just stop drinking water out of water bottles. But we need to start minimizing the amount of pollution that’s even put into the landscape. It’s a place to start; it’s not the place to end.”
  • There may not be a one-size-fits-all analogy for platforms, but “horizontalizing” can help us to understand which solutions worked in other industries, which were under-ambitious and which had unintended consequences. Comparing horizontally also reminds us that the problems of how to regulate the online world are not unique, and will prove as difficult to resolve as those of other large industries.  
  • The key to vertical thinking is to figure out how not to lock in incumbents or to tilt the playing field even more toward them. We often forget that small rivals do exist, and our regulation should think about how to include them. This means fostering a market that has room for ponies and stable horses as well as unicorns.
  • Vertical thinking has started to spread in Washington, DC. In mid January, the antitrust subcommittee in Congress held a hearing with four smaller tech firms. All of them asked for regulatory intervention. The CEO of phone accessory maker PopSockets called Amazon’s behaviour “bullying with a smile.” Amazon purportedly ignored the selling of counterfeit PopSockets products on its platform and punished PopSockets for wanting to end its relationship with Amazon. Both Republicans and Democrats seemed sympathetic to smaller firms’ travails. The question is how to adequately address vertical concerns.
  • Without Improved Governance, Big Firms Will Weaponize Regulation
  • One is the question of intellectual property.
  • Big companies can marshal an army of lawyers, which even medium-sized firms could never afford to do.
  • A second aspect to consider is sliding scales of regulation.
  • A third aspect is burden of proof. One option is to flip the present default and make big companies prove that they are not engaging in harmful behaviour
  • The EU head of antitrust, Margrethe Vestager, is considering whether to turn this on its head: in cases where the European Union suspects monopolistic behaviour, major digital platforms would have to prove that users benefit from their services.
  • Companies would have to prove gains, rather than Brussels having to prove damages. This change would relieve pressure on smaller companies to show harms. It would put obligations on companies such as Google, which Vestager sees as so dominant that she has called them “de facto regulators” in their markets. 
  • A final aspect to consider is possibly mandating larger firms to open up.
Carsten Ullrich

The European Commission's Proposal for a Regulation on Preventing the Dissemination of ... - 0 views

  • DOI: 10.30709/eucrim-2018-024
Carsten Ullrich

My Library - 0 views

  • that the elements which are relevant for assessing whether the proprietor of an EU trade mark is entitled to prohibit the use of a sign in part of the European Union not covered by that action, may be taken into account by that court
  • Although, for the purpose of assessing whether Ornua is entitled to prohibit the use of the sign KERRYMAID in Spain, the referring court should consider taking into account elements present in Ireland and the United Kingdom, it should first of all ensure that there is no significant difference between the market conditions or the sociocultural circumstances
  • In that regard, account should be taken, in particular, of the overall presentation of the product marketed by the third party, the circumstances in which a distinction is made between that mark and the sign used by that third party, and the effort made by that third party to ensure that consumers distinguish its products from those of which it is not the trade mark owner
  • in part of the European Union, an EU trade mark with a reputation and a sign peacefully coexist
  • It cannot be excluded that the conduct which can be expected of the third party so that its use of the sign follows honest practices in industrial or commercial matters must be analysed differently in a part of the European Union where consumers have a particular affinity with the geographical word contained in the mark and the sign at issue than in a part of the European Union where that affinity is weaker.
  • allows the conclusion that in another part of the European Union, where that peaceful coexistence is absent, there is due cause legitimising the use of that sign.
Carsten Ullrich

The battle against disinformation is global - Scott Shackelford | Inforrm's Blog - 0 views

  • the EU is spending more money on combating disinformation across the board by hiring new staff with expertise in data mining and analytics to respond to complaints and proactively detect disinformation
  • The EU also seems to be losing patience with Silicon Valley. It pressured social media giants like Facebook, Google and Twitter to sign the Code of Practice on Disinformation in 2018.
Carsten Ullrich

Article - 0 views

  • The EU Internet Forum was launched in 2015 in response to the alarming increase in the use of the Internet by terrorists to spread extremist propaganda
  • facilitation of a rapid and coordinated cross-border response mechanism to contain the spread of terrorist content online
  • Protocol only applies in exceptional situations, when national crisis management procedures prove insufficient.
Carsten Ullrich

Article - 0 views

  • Self-assessment reports submitted by Facebook, Google, Microsoft, Mozilla and Twitter
  • observed that “[a]ll platform signatories deployed policies and systems to ensure transparency around political advertising, including a requirement that all political ads be clearly labelled as sponsored content and include a ‘paid for by’ disclaimer.”
  • While some of the platforms have gone to the extent of banning political ads, the transparency of issue-based advertising is still significantly neglected.
  • There are notable differences in scope
  • inauthentic behaviour, including the suppression of millions of fake accounts and the implementation of safeguards against malicious automated activities.
  • more granular information is needed to better assess malicious behaviour specifically targeting the EU and the progress achieved by the platforms to counter such behaviour.”
  • several tools have been developed to help consumers evaluate the reliability of information sources, and to open up access to platform data for researchers.
    • Carsten Ullrich
       
      one element of a technical standard: the degree to which consumers are provided with transparent content assessment tools; transparency is still lagging!
  • platforms have not demonstrated much progress in developing and implementing trustworthiness indicators in collaboration with the news ecosystem”, and “some consumer empowerment tools are still not available in most EU Member States.”
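The transparency findings quoted above come down to metadata and labelling obligations attached to each ad. A minimal sketch of what such a record could carry, assuming an illustrative PoliticalAd type; the field names are my own, not taken from any platform’s actual API or from the Code of Practice text.

```python
from dataclasses import dataclass

@dataclass
class PoliticalAd:
    """Illustrative ad record carrying the transparency fields
    the self-assessment reports describe."""
    ad_id: str
    content: str
    is_sponsored: bool    # must be surfaced to the user as a label
    paid_for_by: str      # the 'paid for by' disclaimer text
    is_issue_based: bool  # issue-based ads: the gap the assessment flags

def render_label(ad: PoliticalAd) -> str:
    """Build the user-facing disclosure string for a sponsored ad."""
    if not ad.is_sponsored:
        return ""
    return f"Sponsored · Paid for by {ad.paid_for_by}"

print(render_label(PoliticalAd("a1", "…", True, "Example Campaign Ltd", False)))
```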
Carsten Ullrich

Article - 0 views

  • new measures are designed to make it easier to identify hate crime on the Internet. In future, platforms such as Facebook, Twitter and YouTube will not only be able to delete posts that incite hatred or contain death threats, but also report them to the authorities, along with the user’s IP address.
  • Possibility of extending the scope of the Netzwerkdurchsetzungsgesetz (Network Enforcement Act)
  • new rules on hate crime will be added to the German Strafgesetzbuch (Criminal Code), while the definition of existing offences will be amended to take into account the specific characteristics of the Internet.
    • Carsten Ullrich
       
      internet-specific normative considerations?
Carsten Ullrich

Article - 0 views

  • On 6 February 2020, the audiovisual regulator of the French-speaking community of Belgium (Conseil supérieur de l’audiovisuel – CSA) published a guidance note on the fight against certain forms of illegal Internet content, in particular hate speech
  • In the note, the CSA begins by summarising the current situation, highlighting the important role played by content-sharing platforms and their limited responsibility. It emphasises that some content can be harmful to young people in particular, whether they are the authors or victims of the content. It recognises that regulation, in its current form, is inappropriate and creates an imbalance between the regulation of online content-sharing platform operators, including social networks, and traditional players in the audiovisual sector
  • could take its own legislative measures without waiting for work to start on an EU directive on the subject.
  • if it advocates crimes against humanity; incites or advocates terrorist acts; or incites hatred, violence, discrimination or insults against a person or a group of people on grounds of origin, alleged race, religion, ethnic background, nationality, gender, sexual orientation, gender identity or disability, whether real or alleged.
  • obligations be imposed on the largest content-sharing platform operators, that is, any natural or legal person offering, on a professional basis, whether for remuneration or not, an online content-sharing platform, wherever it is based, used by at least 20% of the population of the French-speaking region of Belgium or the bilingual Brussels-Capital region.
  • obliged to remove or block content notified to them that is ‘clearly illegal’ within 24 hours.
  • need to put in place reporting procedures as well as processes for contesting their decisions
  • appoint an official contact person
  • half-yearly report on compliance with their obligation
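Taken together, the CSA’s proposed obligations describe a notice-and-action workflow with a hard 24-hour clock for ‘clearly illegal’ content. A minimal sketch under that assumption; the Notice record and its field names are illustrative, not the CSA’s terminology.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

REMOVAL_DEADLINE = timedelta(hours=24)  # for notified 'clearly illegal' content

@dataclass
class Notice:
    content_id: str
    received_at: datetime
    clearly_illegal: bool
    decision: str = "pending"   # pending | removed | rejected
    contested: bool = False     # users may contest the platform's decision

def is_overdue(notice: Notice, now: datetime) -> bool:
    """True if a 'clearly illegal' notice has passed the 24-hour window."""
    return (
        notice.clearly_illegal
        and notice.decision == "pending"
        and now - notice.received_at > REMOVAL_DEADLINE
    )

now = datetime.now(timezone.utc)
n = Notice("post-42", now - timedelta(hours=30), clearly_illegal=True)
print(is_overdue(n, now))  # True: should already have been removed or blocked
```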
Carsten Ullrich

Article - 0 views

  • Entwurf für ein Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (draft law to combat right-wing extremism and hate crime)
  • Providers of commercial telemedia services and associated contributors and intermediaries will, in future, be subject to the same information obligations as telecommunications services. A new Article 15a TMG obliges them to disclose information about their users’ inventory data if requested by the Federal Office for the Protection of the Constitution, law enforcement or police authorities, the Militärische Abschirmdienst (Military Counterintelligence Service), the Bundesnachrichtendienst (Federal Intelligence Service) or customs authorities
  • To this end, they are required, at their own expense, to make arrangements for the disclosure of such information within their field of responsibility. Services with over 100 000 customers must also provide a secure electronic interface for this purpose.
  • Social network providers, meanwhile, are subject to proactive reporting obligations
  • The provider must check whether this is the case and report the content immediately, as well as provide the IP address and port number of the person responsible. The user “on whose behalf the content was stored” should be informed that the information has been passed on to the BKA, unless the BKA orders otherwise.
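The proactive reporting obligation in these highlights is essentially a prescribed data flow: forward the content together with the IP address and port number of the person responsible to the BKA, and notify the user unless the BKA orders otherwise. A minimal sketch of such a report payload; the structure and field names are assumptions for illustration, not the statutory format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class HateCrimeReport:
    """Illustrative payload a social network might forward to the BKA."""
    content_id: str
    content_url: str
    ip_address: str    # of the person responsible for the content
    port_number: int   # required alongside the IP address
    notify_user: bool  # user is informed unless the BKA orders otherwise

def build_report(content_id: str, url: str, ip: str, port: int,
                 bka_gag_order: bool = False) -> str:
    """Serialise the report; the notification flag flips on a BKA order."""
    report = HateCrimeReport(content_id, url, ip, port,
                             notify_user=not bka_gag_order)
    return json.dumps(asdict(report))

print(build_report("post-7", "https://example.invalid/post-7",
                   "203.0.113.5", 44321))
```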