Duty of care + Standards _ CU: group items tagged “policy”

Carsten Ullrich

The Next Wave of Platform Governance - Centre for International Governance Innovation

  • The shift from product- and service-based to platform-based business creates a new set of platform governance implications — especially when these businesses rely upon shared infrastructure from a small, powerful group of technology providers (Figure 1).
  • The industries in which AI is deployed, and the primary use cases it serves, will naturally determine the types and degrees of risk, from health and physical safety to discrimination and human-rights violations. Just as disinformation and hate speech are known risks of social media platforms, fatal accidents are a known risk of automobiles and heavy machinery, whether they are operated by people or by machines. Bias and discrimination are potential risks of any automated system, but they are amplified and pronounced in technologies that learn, whether autonomously or by training, from existing data.
  • Business Model-Specific Implications
  • ...7 more annotations...
  • The implications of cloud platforms such as Salesforce, Microsoft, Apple, Amazon and others differ again. A business built on a technology platform with a track record of well-developed data and model governance, audit capability, responsible product development practices and a culture and track record of transparency will likely reduce some risks related to biased data and model transparency, while encouraging (and even enforcing) adoption of those same practices and norms throughout its ecosystem.
  • policies that govern their internal practices for responsible technology development; guidance, tools and educational resources for their customers’ responsible use of their technologies; and policies (enforced in terms of service) that govern the acceptable use of not only their platforms but also specific technologies, such as face recognition or gait detection.
  • At the same time, overreliance on a small, well-funded, global group of technology vendors to set the agenda for responsible and ethical use of AI may create a novel set of risks.
  • Audit is another area that, while promising, is also fraught with potential conflict. Companies such as O’Neil Risk Consulting and Algorithmic Auditing, founded by the author of Weapons of Math Destruction, Cathy O’Neil, provide algorithmic audit and other services intended to help companies better understand and remediate data and model issues related to discriminatory outcomes. Unlike, for example, audits of financial statements, algorithmic audit services are as yet entirely voluntary, lack oversight by any type of governing board, and do not carry disclosure requirements or penalties. As a result, no matter how thorough the analysis or comprehensive the results, these types of services are vulnerable to manipulation or exploitation by their customers for “ethics-washing” purposes.
  • We must broaden our understanding of platforms beyond social media sites to other types of business platforms, examine those risks in context, and approach governance in a way that accounts not only for the technologies themselves, but also for the disparate impacts among industries and business models.
  • This is a time-sensitive issue
  • Large technology companies — for a range of reasons — are trying to fill the policy void, creating the potential for a kind of demilitarized zone for AI, one in which neither established laws nor corporate policy hold sway.
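The audit services discussed above screen model decisions for discriminatory outcomes. As a rough illustration of one statistic such an audit might start from, the sketch below computes the "four-fifths rule" disparate-impact ratio; the data and the 0.8 threshold convention are illustrative only and are not drawn from any actual audit.

```python
# Illustrative disparate-impact screening, NOT a full algorithmic audit.
# A ratio of selection rates below 0.8 is a conventional red flag
# (the "four-fifths rule" used in US employment-discrimination practice).

def selection_rate(outcomes):
    """Fraction of favourable decisions (1 = favourable, 0 = unfavourable)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(protected_group, reference_group):
    """Ratio of the protected group's selection rate to the reference group's."""
    return selection_rate(protected_group) / selection_rate(reference_group)

# Invented example data for two groups of applicants
protected = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # selection rate 0.2
reference = [1, 1, 0, 1, 0, 1, 0, 1, 0, 1]   # selection rate 0.6

ratio = disparate_impact_ratio(protected, reference)
print(round(ratio, 2))  # → 0.33, well below the 0.8 threshold
```

A real audit of the kind O'Neil's firm offers would go far beyond a single ratio — examining data provenance, model features, and downstream outcomes — but the metric shows the basic shape of a quantitative discrimination check.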
Carsten Ullrich

American Internet, American Platforms, American Values - Centre for International Governance Innovation

  • Non-Americans should not be satisfied with this state of affairs, which basically amounts to Americans fighting with other Americans about how to run the world.
    • Carsten Ullrich
       
      !!!
  • that is, the idea that people should have a say in the rules that govern their activities. The Manila Principles, moreover, place an inordinate emphasis on domestic courts to regulate platforms, even though, as my co-author Keller notes, courts lack the expertise and policy-making capacity to do so.
  • What all of these proposals have in common, beyond adopting the American free-speech debate as their starting point, is that they treat these large platforms as an unalterable fact of life. They consider the main question to be not whether these platforms should be making decisions for billions of non-Americans, but how they should make these decisions.
  • ...10 more annotations...
  • The democratic right for non-Americans to determine the rules under which we should live is not even considered. Instead, attempts by democratic governments to impose legitimate democratic regulation on these companies, many of which have assumed the status of essential infrastructure, are derided as creeping authoritarianism or as a threat to the free and open internet.
  • At the very least, thinking of internet governance in these terms should make us more sympathetic to attempts by the Australian, Canadian, German and United Kingdom governments to legislate in this area, rather than be dismissive of the legitimacy of (democratic) governance on its face. If we value democratic oversight, state regulation is almost the only game in town, an approach that can be complemented with international treaty-making among democratic states so as to create agreed-upon minimum standards for regulating cross-border platform activities.
  • To address the first question, in a sense, the global American platforms are free riders on the notion that the internet as a network should be global in reach. Here, a useful analogy is the global financial system. Although we have a global financial system, it is characterized by domestic regulation and, in many countries
  • many of the social harms perpetuated by platforms are the likely result of their business models, which incentivize extremist speech and pervasive surveillance
  • Speech regulation without addressing these root causes is unlikely to be successful. If tools such as internet search functions truly have become essential to knowledge discovery and exhibit natural monopoly characteristics, countries should have the ability to determine for themselves what form they should take. To be blunt, public ownership should be on the table, even if it isn’t, currently, in the United States.
  • Google’s threat (which mirrored Facebook’s) to cut off its search service to Australia was likely due as much, if not more, to Australia’s plan to exercise oversight over its proprietary algorithm as to Australia’s plan to force Google to give a cut of its revenues to various Australian media outlets. The harshness of this threat highlights exactly how hard it will be for non-US countries to exert any meaningful control over the services currently monopolized by these US companies.
  • Already, the United States, as the home of these companies, is working to solidify the market and social dominance of its platforms.
  • As already mentioned, the CUSMA contains provisions protecting free cross-border data flows that, while justified in terms of encouraging trade, serve to preserve the dominance of the US platforms in Canada and Mexico. To this, we can add its successful inclusion of CDA Section 230 language in the agreement, effectively pre-empting Canadian and Mexican debates over what values we wish to apply to platform governance.
  • The first step to coming up with a sound policy involves understanding the policy terrain. In internet governance, and particularly in platform governance, this involves understanding the extent to which the dominant debates and landscape reflect particular US interests and values
  • These interests and values do not necessarily reflect those of people living in other countries. Both Canadians and Americans believe in free speech and market competition. However, our interpretations of the limits of each differ. This reality — the acknowledgement of legitimate differences and the necessity of democratic accountability — should be our starting point in discussions of internet governance, not the desire to preserve a global internet and platform ecosystem that is much less global, and much more American, than it appears.
Carsten Ullrich

A more transparent and accountable Internet? Here's how. | LSE Media Policy Project

  • “Procedural accountability” was a focus of discussion at the March 2018 workshop on platform responsibility convened by LSE’s Truth, Trust and Technology Commission. The idea is that firms should be held to account for the effectiveness of their internal processes in tackling the negative social impact of their services.
  • To be credible and trusted, information disclosed by online firms will need to be independently verified.
  • Piloting a Transparency Reporting Framework
Carsten Ullrich

Facebook's Hate Speech Policies Censor Marginalized Users | WIRED

  • Example of incorrect filtering, raised by LGBT groups
Carsten Ullrich

European regulation of video-sharing platforms: what's new, and will it work? | LSE Media Policy Project

  • This set of rules creates a novel regulatory model
  • Again, leaving regulatory powers to a private entity without any public oversight is clearly not the right solution. But this is also not what, in my opinion, the new AVMSD does
  • But without transparency and information about individual cases, you surely can’t say whether the takedowns are really improving the media environment, or the providers are just trying to get rid of any controversial content – or, indeed, the content somebody just happens to be complaining about.
  • ...4 more annotations...
  • The regulator, on the other hand, has a more detached role when compared to older types of media regulation: it mainly assesses whether mechanisms established by the provider comply with the law
  • This approach gives rise to concerns that we are just outsourcing regulation to private companies.
  • Indeed, the delegation of the exercise of regulatory powers to a private entity could be very damaging to freedom of speech and media.
  • So, I think the legal groundwork for protection but also the fair treatment of users is in the directive. Now it depends on the member states to implement it in such a way that this potential will be fulfilled (and the European Commission has a big role in this process).
Carsten Ullrich

Facebook and the EU, or the failure of self-regulation | The Guest Blog

  • How did we let this happen? Why do we appear so weak?
  • For years Brussels has been the champion of self-regulation. The dogma is – at least publicly – based on the assumption that companies know best how to tackle some of the challenges.
  • Our failure to understand the underlying challenges and a failure of regulation.
  • ...2 more annotations...
  • If it’s the latter, then we have to move away from self-regulation. We can’t continue defending self-regulation and fake outrage when what we already knew becomes public.
  • Some will shift all the blame to Facebook, but we are at least as responsible as they are. EU decision-makers let this happen with self-regulation and soft policy.
Carsten Ullrich

Article

  • Self-assessment reports submitted by Facebook, Google, Microsoft, Mozilla and Twitter
  • Observed that “[a]ll platform signatories deployed policies and systems to ensure transparency around political advertising, including a requirement that all political ads be clearly labelled as sponsored content and include a ‘paid for by’ disclaimer.”
  • While some of the platforms have gone to the extent of banning political ads, the transparency of issue-based advertising is still significantly neglected.
  • ...5 more annotations...
  • There are notable differences in scope
  • inauthentic behaviour, including the suppression of millions of fake accounts and the implementation of safeguards against malicious automated activities.
  • more granular information is needed to better assess malicious behaviour specifically targeting the EU and the progress achieved by the platforms to counter such behaviour.”
  • several tools have been developed to help consumers evaluate the reliability of information sources, and to open up access to platform data for researchers.
    • Carsten Ullrich
       
      one element of a technical standard: the degree to which consumers are provided with transparent content-assessment tools; transparency still lagging!
  • platforms have not demonstrated much progress in developing and implementing trustworthiness indicators in collaboration with the news ecosystem”, and “some consumer empowerment tools are still not available in most EU Member States.”
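The labelling requirement reported above — that every political ad be marked as sponsored content and carry a "paid for by" disclaimer — can be pictured as a simple metadata check. The sketch below is hypothetical: the field names are invented for illustration and do not reflect any platform's actual ad API.

```python
# Hypothetical compliance check for political-ad transparency metadata.
# "labelled_sponsored" and "paid_for_by" are invented field names.

def is_transparent(ad: dict) -> bool:
    """True if the ad is labelled as sponsored AND discloses its funder."""
    return (
        ad.get("labelled_sponsored") is True
        and bool(ad.get("paid_for_by", "").strip())
    )

ads = [
    {"id": 1, "labelled_sponsored": True,  "paid_for_by": "Example Party"},
    {"id": 2, "labelled_sponsored": True,  "paid_for_by": ""},       # funder missing
    {"id": 3, "labelled_sponsored": False, "paid_for_by": "Anon"},   # unlabelled
]

non_compliant = [ad["id"] for ad in ads if not is_transparent(ad)]
print(non_compliant)  # → [2, 3]
```

The Commission's observation about issue-based advertising suggests the harder problem is not this check itself but deciding which ads count as "political" in the first place.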
Carsten Ullrich

JIPLP: Editorial - Control of content on social media

  • Can technology resolve these issues? As regards technical solutions, there are already examples of these, such as YouTube’s Content ID, an automated piece of software that scans material uploaded to the site for IP infringement by comparing it against a database of registered IPs. The next challenge may be how these types of systems can be harnessed by online platform providers to address extreme and hate crime content. Again the dilemma for policy- and law-makers may be the extent to which they are prepared to cede control over content to technology companies, which will become judge, jury and executioner. 
  • who should bear the cost of monitoring and removal.
  • To block access to websites where infringing content has been hosted. In Cartier International AG & Ors v British Sky Broadcasting Ltd & Ors [2016] EWCA Civ 658 the Court of Appeal concluded that it is entirely reasonable to expect ISPs to pay the costs associated with implementing mechanisms to block access to sites where infringing content has been made available
  • ...1 more annotation...
  • Thus the cost of implementing the order could be regarded as just another overhead associated with ISPs carrying on their business
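The editorial above describes Content ID-style scanning: uploaded material is compared against a database of registered works. The sketch below is a toy illustration of that idea, with an exact hash standing in for the robust perceptual fingerprints real systems use (which tolerate re-encoding, cropping and other transformations); all names and data are invented.

```python
# Toy sketch of fingerprint matching against a registry of claimed works.
# Real systems (e.g. YouTube's Content ID) use fuzzy audio/video
# fingerprints, not exact hashes; sha256 over normalized text is a stand-in.

import hashlib

def fingerprint(content: str) -> str:
    """Stand-in fingerprint: hash of the normalized content."""
    return hashlib.sha256(content.lower().strip().encode()).hexdigest()

# Hypothetical registry mapping fingerprints to rights-holders
registered = {fingerprint("Example Film 2016"): "Studio A"}

def check_upload(content: str):
    """Return the claiming rights-holder, or None if no registered match."""
    return registered.get(fingerprint(content))

print(check_upload("example film 2016"))   # matches after normalization
print(check_upload("original home video")) # no match → None
```

The policy dilemma the editorial raises sits one level above this mechanism: once a match is found, it is the platform's matching system, not a court, that decides what happens next.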
Carsten Ullrich

Tech companies can distinguish between free speech and hate speech if they want to - Da...

  • Facebook has come under recent criticism for censoring LGBTQ people’s posts because they contained words that Facebook deem offensive. At the same time, the LGBTQ community are one of the groups frequently targeted with hate speech on the platform. If users seem to “want their cake and eat it too”, the tech companies are similarly conflicted.
  • At the same time, the laws of many countries like Germany, and other international conventions, explicitly limit these freedoms when it comes to hate speech.
  • It would not be impossible for tech companies to form clear guidelines within their own platforms about what was and wasn’t permissible. For the mainly US companies, this would mean that they would have to be increasingly aware of the differences between US law and culture and those of other countries.
Carsten Ullrich

IRIS Newsletter

    • Carsten Ullrich
       
      ask Cedric for background and how it works, especially the algorithmic transparency
  • On 19 September, Google and the Association to Combat Audiovisual Piracy (Association de Lutte contre la Piraterie Audiovisuelle - “ALPA”) signed a partnership agreement aimed at effectively reinforcing copyright protection for the on-line exploitation of audiovisual works.
  • under the auspices of the National Centre for Cinema (Centre National du Cinéma - “the CNC”)
  • ...3 more annotations...
  • Google’s video platform, YouTube, will make its Content ID algorithm available to ALPA.
  • The algorithm is a tool for identifying and managing rights; ALPA will be able to apply the “block” and “follow” rules directly for any work placed on-line without the authorisation of the respective rights-holders. In this way it will be possible for rights-holders to add their works to the Content ID filter and to ensure that their films and productions are not placed on YouTube without their consent. Google also undertakes to prevent pirate streaming and downloading sites from fraudulently buying key words through its AdWords service. It also undertakes to provide ALPA with financial support; the agreement is witness to its determination to contribute to the fight against piracy and to strengthen its policy of cooperation with originators and rights-holders.
  • The President of ALPA, Nicolas Seydoux, welcomed the agreement, which he said symbolised “the collapse of a wall of incomprehension” between Google and ALPA
  • Check with Cedric on background
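The "block" and "follow" rules described in the agreement amount to a per-work policy lookup applied whenever an upload matches a registered work. A minimal sketch, with invented identifiers and messages (not ALPA's or YouTube's actual interface):

```python
# Sketch of rights-holder match policies: each registered work carries a
# rule that decides what happens when an upload matches it.
# Work IDs, rule names and messages are illustrative only.

from enum import Enum

class Rule(Enum):
    BLOCK = "block"    # take the matching upload down
    FOLLOW = "follow"  # leave it up, but track its usage

# Hypothetical per-work policies chosen by rights-holders
policies = {"film-123": Rule.BLOCK, "series-456": Rule.FOLLOW}

def apply_rule(matched_work_id: str) -> str:
    """Decide the fate of an upload that matched the given registered work."""
    rule = policies.get(matched_work_id)
    if rule is Rule.BLOCK:
        return "upload blocked"
    if rule is Rule.FOLLOW:
        return "upload kept, usage tracked for rights-holder"
    return "no policy on file; upload unaffected"

print(apply_rule("film-123"))  # → upload blocked
```

The note about algorithmic transparency above is apt: which rule fires, and how matches are determined, is exactly the part of the system outsiders cannot currently inspect.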