XY v Facebook Ireland Ltd [2012] NIQB 96 (30 November 2012)

- [19] The Order of the Court will incorporate provision for liberty to apply. By this mechanism the Plaintiff, if necessary and if so advised, will be able to seek further relief from the Court if there is any recurrence of the offending publication. Of course, in such eventuality, it will be open to Facebook, acting responsibly and in accordance with the principles and themes clearly expressed in this judgment, to proactively take the necessary removal and closure steps.
- [20] I refuse the Plaintiff's application for the wider form of interim injunction sought by him. This was to the effect that Facebook be required to monitor the offending webpage in order to prevent republication of the offensive material. In this respect, I prefer the argument of Mr Hopkins that such an order would lack the requisite precision, could impose a disproportionate burden and, further, would potentially require excessive supervision by the Court. See Co-operative Insurance Society Ltd v Argyll Stores (Holdings) Ltd [1997] 3 All ER 297, pages 303–304, per Lord Hoffmann. See also Halsbury's Laws of England, Volume 24 (Fourth Edition Reissue), paragraph 849. The propriety of granting this discrete remedy will, of course, be revisited at the substantive trial, against the backcloth of a fuller evidential matrix, which should include details of how this social networking site actually operates from day to day.
Article

- Self-assessment reports submitted by Facebook, Google, Microsoft, Mozilla and Twitter
- Observed that “[a]ll platform signatories deployed policies and systems to ensure transparency around political advertising, including a requirement that all political ads be clearly labelled as sponsored content and include a ‘paid for by’ disclaimer.”
- While some of the platforms have gone so far as to ban political ads outright, the transparency of issue-based advertising remains significantly neglected.
A New Blueprint for Platform Governance | Centre for International Governance Innovation

- We often talk about the “online environment.” This metaphorical language makes it seem as though the online space resembles our offline world. For example, the term “information pollution,” coined by Claire Wardle, is increasingly used to discuss disinformation online.
- It is even harder to prove direct connections between online platforms and offline harms, partly because platforms are not transparent.
- Finally, this analogy reminds us that both problems are dispiritingly hard to solve. Two scholars, Whitney Phillips and Ryan Milner, have suggested that our online information problems are ecosystemic, similar to the climate crisis.
What mobile internet filtering tells us about porn blocks | Open Rights Group
Digital Services Act: Ensuring a trustworthy and safe online environment while allowing...

- The EU’s overall objectives are certainly well-intended. However, many concerns remain, for instance:
- The DSA should tackle bad actors and behaviours regardless of a platform’s size and country of origin. Having a specific regime for “very large online platforms” with additional obligations leaves the door open for rogue players simply to move to smaller digital service providers that are subject to a lighter regime.
- To prevent legal uncertainty, the DSA should have a clear scope focused on illegal content, products and services. The rules should be horizontal and principle-based, and could in a second phase be complemented with more targeted measures (legislative and non-legislative) to tackle specific concerns.
American Internet, American Platforms, American Values - Centre for International Gover...

- Non-Americans should not be satisfied with this state of affairs, which essentially amounts to Americans arguing with other Americans about how to run the world.
- that is, the idea that people should have a say in the rules that govern their activities. The Manila Principles, moreover, place an inordinate emphasis on domestic courts to regulate platforms, even though, as my co-author Keller notes, courts lack the expertise and policy-making capacity to do so.
- What all of these proposals have in common, beyond adopting the American free-speech debate as their starting point, is that they treat these large platforms as an unalterable fact of life. They consider the main question to be not whether these platforms should be making decisions for billions of non-Americans, but how they should make those decisions.
Broad Consequences of a Systemic Duty of Care for Platforms - Daphne Keller [Updated] |...

- On the up-side, flexible standards would give platforms more leeway to figure out meaningful technical improvements, and perhaps to arrive at more nuanced automated assessment of content over time.
- The down-sides of open-ended SDOC standards could be considerable, though. Proactive measures devised by platforms themselves would, even when coupled with transparency obligations, be far less subject to meaningful public review and accountability.
Algorithm Transparency: How to Eat the Cake and Have It Too - European Law Blog

- While AI tools still exist in a relative legal vacuum, this blog post explores: 1) the extent of protection granted to algorithms as trade secrets, with exceptions for overriding public interest; 2) how the new generation of regulations at the EU and national levels attempts to provide algorithm transparency while preserving trade secrecy; and 3) why the latter development is not a futile endeavour.
- The most complex algorithms dominating our lives (including those developed by Google and Facebook) are proprietary, i.e. shielded as trade secrets, while only a negligible minority of algorithms are open source.
- Article 2 of the EU Trade Secrets Directive