Home/ Duty of care + Standards _ CU/ Group items tagged online


Carsten Ullrich

A New Blueprint for Platform Governance | Centre for International Governance Innovation - 0 views

  • We often talk about the “online environment.” This metaphorical language makes it seem like the online space looks similar to our offline world. For example, the term “information pollution,” coined by Claire Wardle, is increasingly being used to discuss disinformation online.  
  • It is even harder to prove direct connections between online platforms and offline harms. This is partly because platforms are not transparent.
  • Finally, this analogy reminds us that both problems are dispiritingly hard to solve. Two scholars, Whitney Phillips and Ryan Milner, have suggested that our online information problems are ecosystemic, similar to the climate crisis.
  • ...12 more annotations...
  • As Phillips argues, “we’re not going to solve the climate crisis if people just stop drinking water out of water bottles. But we need to start minimizing the amount of pollution that’s even put into the landscape. It’s a place to start; it’s not the place to end.”
  • There may not be a one-size-fits-all analogy for platforms, but “horizontalizing” can help us to understand which solutions worked in other industries, which were under-ambitious and which had unintended consequences. Comparing horizontally also reminds us that the problems of how to regulate the online world are not unique, and will prove as difficult to resolve as those of other large industries.  
  • The key to vertical thinking is to figure out how not to lock in incumbents or to tilt the playing field even more toward them. We often forget that small rivals do exist, and our regulation should think about how to include them. This means fostering a market that has room for ponies and stable horses as well as unicorns.
  • Vertical thinking has started to spread in Washington, DC. In mid-January, the antitrust subcommittee in Congress held a hearing with four smaller tech firms. All of them asked for regulatory intervention. The CEO of phone accessory maker PopSockets called Amazon’s behaviour “bullying with a smile.” Amazon purportedly ignored the sale of counterfeit PopSockets products on its platform and punished PopSockets for wanting to end its relationship with Amazon. Both Republicans and Democrats seemed sympathetic to smaller firms’ travails. The question is how to adequately address vertical concerns.
  • Without Improved Governance, Big Firms Will Weaponize Regulation
  • One is the question of intellectual property.
  • Big companies can marshal an army of lawyers, which even medium-sized firms could never afford to do.
  • A second aspect to consider is sliding scales of regulation.
  • A third aspect is burden of proof. One option is to flip the present default and make big companies prove that they are not engaging in harmful behaviour
  • The EU head of antitrust, Margrethe Vestager, is considering whether to turn this on its head: in cases where the European Union suspects monopolistic behaviour, major digital platforms would have to prove that users benefit from their services.
  • Companies would have to prove gains, rather than Brussels having to prove damages. This change would relieve pressure on smaller companies to show harms. It would put obligations on companies such as Google, which Vestager sees as so dominant that she has called them “de facto regulators” in their markets. 
  • A final aspect to consider is possibly mandating larger firms to open up.
Carsten Ullrich

What Facebook isn't telling us about its fight against online abuse - Laura Bliss | Inf... - 0 views

  • In a six-month period from October 2017 to March 2018, 21m sexually explicit pictures, 3.5m graphically violent posts and 2.5m forms of hate speech were removed from its site. These figures help reveal some striking points.
  • As expected, the data indicates that the problem is getting worse.
    • Carsten Ullrich
       
      problem is getting worse - use as argument - look at facebook report
  • For instance, between January and March it was estimated that for every 10,000 messages online, between 22 and 27 contained graphic violence, up from 16 to 19 in the previous three months.
  • ...9 more annotations...
  • Here, the company has been proactive. Between January and March 2018, Facebook removed 1.9m messages encouraging terrorist propaganda, an increase of 800,000 comments compared to the previous three months. A total of 99.5% of these messages were located with the aid of advancing technology.
  • But Facebook hasn’t released figures showing how prevalent terrorist propaganda is on its site. So we really don’t know how successful the software is in this respect.
    • Carsten Ullrich
       
we need data - this would be part of my demand for a standardized reporting system
  • on self-regulation,
  • Between the two three-month periods there was a 183% increase in the amount of posts removed that were labelled graphically violent. A total of 86% of these comments were flagged by a computer system.
  • But we also know that Facebook’s figures also show that up to 27 out of every 10,000 comments that made it past the detection technology contained graphic violence.
  • One estimate suggests that 510,000 comments are posted every minute. If accurate, that would mean 1,982,880 violent comments are posted every 24 hours.
  • Facebook has also used technology to aid the removal of graphic violence from its site.
  • This brings us to the other significant figure not included in the data released by Facebook: the total number of comments reported by users. As this is a fundamental mechanism in tackling online abuse, the amount of reports made to the company should be made publicly available
  • However, even Facebook still has a long way to go to get to total transparency. Ideally, all social networking sites would release annual reports on how they are tackling abuse online. This would enable regulators and the public to hold the firms more directly to account for failures to remove online abuse from their servers.
    • Carsten Ullrich
       
      my demand - standardized reporting
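The article's extrapolation from Facebook's own rates can be checked with a quick back-of-envelope calculation. This is only a sketch of the arithmetic as reported (510,000 comments per minute, up to 27 per 10,000 containing graphic violence), not Facebook data:

```python
# Reproduce the article's estimate of violent comments per day.
comments_per_minute = 510_000                      # one published estimate, per the article
comments_per_day = comments_per_minute * 60 * 24   # 734,400,000 comments per day
violent_per_10k = 27                               # upper bound from Facebook's figures

# Integer arithmetic avoids floating-point rounding.
violent_per_day = comments_per_day * violent_per_10k // 10_000
print(violent_per_day)  # 1982880, matching the article's 1,982,880
```

The figure is sensitive to both inputs: at the lower bound of 22 per 10,000, the same arithmetic gives about 1.6m violent comments per day.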
Carsten Ullrich

Online Harms: Government publishes response to consultation, Ofcom to be given powers t... - 0 views

  • A small group of companies with the largest online presence and high-risk features, which is likely to include Facebook, TikTok, Instagram and Twitter, will be in Category 1, while Category 2 services include platforms that host dating services or pornography and private messaging apps. The UK government has said that less than 3% of UK businesses will fall within the scope of the legislation, and that the vast majority of companies that do will be Category 2 services.
Carsten Ullrich

How to regulate Facebook and the online giants in one word: transparency - George Brock... - 0 views

  • New responsibilities arise from these changes.
  • Greater transparency will disclose whether further regulation is required and make it better targeted, providing specific remedies for clearly identified ills.
  • If Facebook and others must account in detail to an electoral commission or data protection authority for micro-targeting or “dark” ads, are forbidden from deleting certain relevant data, and must submit to algorithm audits, they will be forced to foresee and to try to solve some of the problems which they have been addressing so slowly
  • ...1 more annotation...
  • Transparency would have its own radical effect inside the tech giants
Carsten Ullrich

The white paper on online harms is a global first. It has never been more needed | John... - 0 views

  • Could it be, another wondered, that the flurry of apocalyptic angst reflected the extent to which the Californian Ideology (which held that cyberspace was beyond the reach of the state) had seeped into the souls of even well-intentioned critics?
  • In reality, the problem we have is not the internet so much as those corporations that ride on it and allow some unacceptable activities to flourish on their platforms
  • This is what ethicists call “obligation responsibility” and in this country we call a duty of care.
  • ...8 more annotations...
  • corporate responsibility
  • Since the mid-1990s, internet companies have been absolved from liability – by Section 230 of the 1996 US Telecommunications Act and to some extent by the EU’s e-commerce directive – for the damage that their platforms do.
  • Sooner or later, democracies will have to bring these outfits under control and the only question is how best to do it. The white paper suggests one possible way forward.
  • essentially a responsibility for unintended consequences of the way you have set up and run your business.
  • The white paper says that the government will establish a new statutory duty of care on relevant companies “to take reasonable steps to keep their users safe and tackle illegal and harmful activity on their services”.
  • for example assessing and responding to the risk associated with emerging harms or technology
  • Stirring stuff, eh? It has certainly taken much of the tech industry aback, especially those for whom the idea of government regulation has always been anathema and who regard this fancy new “duty of care” as a legal fantasy dreamed up in an undergraduate seminar.
  • To which the best riposte is perhaps the old Chinese proverb that the longest journey begins with a single step. This white paper is it.
Carsten Ullrich

Upload filters, copyright and magic pixie dust - Copybuzz - 0 views

  • At the heart of the initiative is a plan for online platforms to “increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.” Significantly, the ideas are presented as “guidelines and principles”. That’s because they are entirely voluntary. Except that the Commission makes it quite clear that if this totally voluntary system is not implemented by companies like Facebook and Google, it will bring in new laws to make them do it on a not-so-voluntary basis. The Commission is quite eager to see swift results from these voluntary efforts, as legislative proposals could already be on the table by May 2018.
  • But the worst idea, and one that appears multiple times in the latest plans, is the routine and pervasive use of upload filters.
  • In doing so, they have caused notable collateral damage, especially to fundamental rights.
  • ...3 more annotations...
  • The European Commission is well aware that Article 15 of the E-Commerce Directive explicitly prohibits Member States from imposing “a general obligation on providers … to monitor the information which they transmit or store, [or] a general obligation actively to seek facts or circumstances indicating illegal activity.
  • does indeed involve a “general obligation” on those companies to filter all uploads for a vast range of “illegal content”
  • That lack of good faith makes the Commission’s stubborn insistence on a non-existent technical solution to a non-existent problem even more frustrating. If it had the courage to admit the truth about the unproblematic nature of unauthorised sharing of copyright materials, it wouldn’t need to come up with unhelpful approaches like upload filters that are certain to cause immense harm to both the online world and to the EU’s Digital Single Market.
Carsten Ullrich

Digital Services Act: Ensuring a trustworthy and safe online environment while allowing... - 0 views

  • The EU’s overall objectives are certainly well-intended. However, many concerns remain, for instance:
  • The DSA should tackle bad players and behaviours regardless of the platform’s size and country of origin. Having a specific regime for “very large online platforms” with additional obligations leaves the door open for rogue players to simply move to smaller digital service providers that are subject to a lighter regime.
  • To prevent legal uncertainty, the DSA should have a clear scope focusing on illegal content, products and services. The rules should be horizontal and principle-based, and could in a second phase be complemented with more targeted measures (legislative and non-legislative) to tackle specific concerns. 
  • ...3 more annotations...
  • While well-intended, EU policymakers should find the appropriate equilibrium between transparency, the protection against rogue players’ attempts to game the system, and the protection of operators’ trade secrets. Any new requirement must be achievable, proportionate to known risks and provide real added value.
  • Undermining the ‘country of origin’ principle would fragment the EU Single Market and create more red tape for national businesses trying to become European businesses.
Carsten Ullrich

Is the Era of "Permissionless Innovation" and Avoidance of Regulation on the Internet F... - 0 views

  • avoidance of regulation that the Silicon Valley platforms
  • It hasn’t been a great couple of weeks for the “Don’t Be Evil” company.
  • The Supreme Court had upheld a lower court ruling requiring Google to delist from its global search results references to a rogue Canadian company that is the subject of an injunction in British Columbia (B.C.)
  • ...14 more annotations...
  • intellectual property infringement.
  • The Google/Equustek case is not one of permissionless innovation, but is still an example of a large internet intermediary taking the position that it can do as it damned well pleases because, after all, it operates in multiple jurisdictions—in fact it operates in cyberspace, where, according to some, normal regulatory practices and laws shouldn’t apply or we will “stifle innovation”.
  • One innovation that Google has instituted is to tweak its geolocation system
  • The excuse of “it’s not my fault; blame the algorithm”, also won’t fly anymore. Google’s algorithms are the “secret sauce” that differentiates it from its competitors, and the dominance of Google is proof of the effectiveness of its search formulae.
    • Carsten Ullrich
       
      courts have become streetwise on the "algorithm"
  • But scooping up every bit of information and interpreting what people want (or what Google thinks they want) through an algorithm has its downsides. A German court has found that Google cannot hide behind its algorithms when it comes to producing perverse search results
  • AI is great, until it isn’t, and there is no doubt that regulators will start to look at legal issues surrounding AI.
  • Companies like Google and Facebook will not be able to duck their responsibility just because results that are potentially illegal are produced by algorithms or AI
  • One area where human judgement is very much involved is in the placing of ads, although Youtube and others are quick to blame automated programs when legitimate ads appear alongside questionable or illegal content. Platforms have no obligation to accept ads as long as they don’t engage in non-competitive trade practices
  • Google has already learned its lesson on pharmaceutical products the hard way, having been fined $500 million in 2011 for running ads on its Adwords service from unlicensed Canadian online pharmacies illegally (according to US law) selling prescriptions to US consumers.
  • Google is a deep-pocketed corporation but it seems to have got the message when it comes to pharmaceuticals. What galls me is that if Google can remove Adwords placements promoting illegal drug products, why, when I google “watch pirated movies”, do I get an Adwords listing on page 1 of search that says “Watch HD Free Full Movies Online”.
  • At the end of the day whether it is Google, Facebook, Amazon, or any other major internet intermediary, the old wheeze that respect for privacy, respect for copyright and just plain old respect for the law in general gets in the way of innovation is being increasingly shown to be a threadbare argument.
  • What is interesting is that many cyber-libertarians who oppose any attempt to impose copyright obligations and publishing liability on internet platforms are suddenly starting to get nervous about misuse of data by these same platforms when it comes to privacy.
  • This is a remarkable revelation for someone who has not only advocated that Canada adopt in NAFTA the overly-broad US safe harbour provisions found in the Communications Decency Act, a provision that has been widely abused in the US by internet intermediaries as a way of ducking any responsibility for the content they make available, but who has consistently crusaded against any strengthening of copyright laws that might impose greater obligations on internet platforms.
  • proponents of reasonable internet regulation
Carsten Ullrich

Online Harms White Paper: Two comments on "harms" - Hugh Tomlinson QC | Inforrm's Blog - 0 views

  • A number of the other “harms” identified in the White Paper may also constitute breaches of data protection law.