Duty of care + Standards _ CU: Group items tagged "reporting"

Carsten Ullrich

What Facebook isn't telling us about its fight against online abuse - Laura Bliss | Inf...

  • In a six-month period from October 2017 to March 2018, 21m sexually explicit pictures, 3.5m graphically violent posts and 2.5m forms of hate speech were removed from its site. These figures help reveal some striking points.
  • As expected, the data indicates that the problem is getting worse.
    • Carsten Ullrich
       
      problem is getting worse - use as argument - look at facebook report
  • For instance, between January and March it was estimated that for every 10,000 messages online, between 22 and 27 contained graphic violence, up from 16 to 19 in the previous three months.
  • Here, the company has been proactive. Between January and March 2018, Facebook removed 1.9m messages encouraging terrorist propaganda, an increase of 800,000 comments compared to the previous three months. A total of 99.5% of these messages were located with the aid of advancing technology.
  • But Facebook hasn’t released figures showing how prevalent terrorist propaganda is on its site. So we really don’t know how successful the software is in this respect.
    • Carsten Ullrich
       
      we need data; this would be part of my demand for a standardized reporting system
  • on self-regulation,
  • Between the two three-month periods there was a 183% increase in the amount of posts removed that were labelled graphically violent. A total of 86% of these comments were flagged by a computer system.
  • But we also know that Facebook’s figures also show that up to 27 out of every 10,000 comments that made it past the detection technology contained graphic violence.
  • One estimate suggests that 510,000 comments are posted every minute. If accurate, that would mean 1,982,880 violent comments are posted every 24 hours.
  • Facebook has also used technology to aid the removal of graphic violence from its site.
  • This brings us to the other significant figure not included in the data released by Facebook: the total number of comments reported by users. As this is a fundamental mechanism in tackling online abuse, the number of reports made to the company should be made publicly available.
  • However, even Facebook still has a long way to go to get to total transparency. Ideally, all social networking sites would release annual reports on how they are tackling abuse online. This would enable regulators and the public to hold the firms more directly to account for failures to remove online abuse from their servers.
    • Carsten Ullrich
       
      my demand - standardized reporting
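The daily-volume figure highlighted above can be cross-checked with a few lines of arithmetic. A quick sketch; the 510,000 comments-per-minute volume and the 27-per-10,000 prevalence rate are the article's own estimates, not verified figures:

```python
# Estimates quoted in the article (assumptions, not Facebook's own figures):
COMMENTS_PER_MINUTE = 510_000   # one estimate of global posting volume
VIOLENT_PER_10K = 27            # upper bound reported for Jan-Mar 2018

# Scale the per-minute volume up to a 24-hour day.
comments_per_day = COMMENTS_PER_MINUTE * 60 * 24

# Apply the 27-in-10,000 prevalence rate to the daily volume.
violent_per_day = comments_per_day * VIOLENT_PER_10K // 10_000

print(f"{comments_per_day:,} comments/day, {violent_per_day:,} violent")
```

This reproduces the article's figure of 1,982,880 violent comments slipping past detection every 24 hours.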
Carsten Ullrich

Article

  • Entwurf für ein Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität (Draft Law on Combating Right-Wing Extremism and Hate Crime)
  • Providers of commercial telemedia services and associated contributors and intermediaries will, in future, be subject to the same information obligations as telecommunications services. A new Article 15a TMG obliges them to disclose information about their users’ inventory data if requested by the Federal Office for the Protection of the Constitution, law enforcement or police authorities, the Militärische Abschirmdienst (Military Counterintelligence Service), the Bundesnachrichtendienst (Federal Intelligence Service) or customs authorities
  • To this end, they are required, at their own expense, to make arrangements for the disclosure of such information within their field of responsibility. Services with over 100 000 customers must also provide a secure electronic interface for this purpose.
  • Social network providers, meanwhile, are subject to proactive reporting obligations
  • The provider must check whether this is the case and report the content immediately, as well as provide the IP address and port number of the person responsible. The user “on whose behalf the content was stored” should be informed that the information has been passed on to the BKA, unless the BKA orders otherwise.
Carsten Ullrich

Article

  • On 6 February 2020, the audiovisual regulator of the French-speaking community of Belgium (Conseil supérieur de l’audiovisuel – CSA) published a guidance note on the fight against certain forms of illegal Internet content, in particular hate speech
  • In the note, the CSA begins by summarising the current situation, highlighting the important role played by content-sharing platforms and their limited responsibility. It emphasises that some content can be harmful to young people in particular, whether they are the authors or victims of the content. It recognises that regulation, in its current form, is inappropriate and creates an imbalance between the regulation of online content-sharing platform operators, including social networks, and traditional players in the audiovisual sector
  • ould take its own legislative measures without waiting for work to start on an EU directive on the subject. 
  • if it advocates crimes against humanity; incites or advocates terrorist acts; or incites hatred, violence, discrimination or insults against a person or a group of people on grounds of origin, alleged race, religion, ethnic background, nationality, gender, sexual orientation, gender identity or disability, whether real or alleged.
  • obligations be imposed on the largest content-sharing platform operators, that is, any natural or legal person offering, on a professional basis, whether for remuneration or not, an online content-sharing platform, wherever it is based, used by at least 20% of the population of the French-speaking region of Belgium or the bilingual Brussels-Capital region.
  • obliged to remove or block content notified to them that is ‘clearly illegal’ within 24 hours.
  • need to put in place reporting procedures as well as processes for contesting their decisions
  • appoint an official contact person
  • half-yearly report on compliance with their obligation
Carsten Ullrich

Automated censorship is not the answer to extremism: unbalanced Home Affairs Committee ...

  •  
    UK Parliamentary committee report comment
Carsten Ullrich

Facebook Publishes Enforcement Numbers for the First Time | Facebook Newsroom

  • 86% of which was identified by our technology before it was reported to Facebook.
  • For hate speech, our technology still doesn’t work that well and so it needs to be checked by our review teams. We removed 2.5 million pieces of hate speech in Q1 2018 — 38% of which was flagged by our technology.
  • In addition, in many areas — whether it’s spam, porn or fake accounts — we’re up against sophisticated adversaries who continually change tactics to circumvent our controls,
Carsten Ullrich

A more transparent and accountable Internet? Here's how. | LSE Media Policy Project

  • “Procedural accountability” was a focus of discussion at the March 2018 workshop on platform responsibility convened by LSE’s Truth, Trust and Technology Commission. The idea is that firms should be held to account for the effectiveness of their internal processes in tackling the negative social impact of their services.
  • To be credible and trusted, information disclosed by online firms will need to be independently verified.
  • Piloting a Transparency Reporting Framework
Carsten Ullrich

Article

  • new measures are designed to make it easier to identify hate crime on the Internet. In future, platforms such as Facebook, Twitter and YouTube will not only be able to delete posts that incite hatred or contain death threats, but also report them to the authorities, along with the user’s IP address.
  • possibility of extending the scope of the Netzwerkdurchsetzungsgesetz
  • new rules on hate crime will be added to the German Strafgesetzbuch (Criminal Code), while the definition of existing offences will be amended to take into account the specific characteristics of the Internet.
    • Carsten Ullrich
       
      internet-specific normative considerations?
Carsten Ullrich

Article

  • Self-assessment reports submitted by Facebook, Google, Microsoft, Mozilla and Twitter
  • Observed that “[a]ll platform signatories deployed policies and systems to ensure transparency around political advertising, including a requirement that all political ads be clearly labelled as sponsored content and include a ‘paid for by’ disclaimer.”
  • While some of the platforms have gone to the extent of banning political ads, the transparency of issue-based advertising is still significantly neglected.
  • There are notable differences in scope
  • inauthentic behaviour, including the suppression of millions of fake accounts and the implementation of safeguards against malicious automated activities.
  • more granular information is needed to better assess malicious behaviour specifically targeting the EU and the progress achieved by the platforms to counter such behaviour.”
  • several tools have been developed to help consumers evaluate the reliability of information sources, and to open up access to platform data for researchers.
    • Carsten Ullrich
       
      one element of a technical standard: the degree to which consumers are given transparent content-assessment tools; transparency still lagging!
  • platforms have not demonstrated much progress in developing and implementing trustworthiness indicators in collaboration with the news ecosystem”, and “some consumer empowerment tools are still not available in most EU Member States.”
Carsten Ullrich

EUR-Lex - 52003DC0702 - EN - EUR-Lex

  • Article 15 prevents Member States from imposing on internet intermediaries, with respect to activities covered by Articles 12-14, a general obligation to monitor the information which they transmit or store or a general obligation to actively seek out facts or circumstances indicating illegal activities. This is important, as general monitoring of millions of sites and web pages would, in practical terms, be impossible and would result in disproportionate burdens on intermediaries and higher costs of access to basic services for users. [73] However, Article 15 does not prevent public authorities in the Member States from imposing a monitoring obligation in a specific, clearly defined individual case. [73] In this context, it is important to note that the reports and studies on the effectiveness of blocking and filtering applications appear to indicate that there is not yet any technology which could not be circumvented and provide full effectiveness in blocking or filtering illegal and harmful information whilst at the same time avoiding blocking entirely legal information resulting in violations of freedom of speech.
    • Carsten Ullrich
       
      justifications mainly relate to economic viability and overblocking, but not surveillance
  •  
    justification for Article 15
Carsten Ullrich

CopyCamp Conference Discusses Fallacies Of EU Copyright Reform Amid Ideas For Copy Chan...

  • Beyond the potential negative economic aspects, several speakers at the Copycamp conference rang the alarm bells over the potential fallout of round-the-clock obligatory monitoring and filtering of user content on the net. Diego Naranjo from the European Digital Rights initiative (EDRi) reported: “I heard one of the EU member state representatives say, ‘Why do we use this (filtering system) only for copyright?’,” he said. The idea of bringing down the unauthorised publication of copyrighted material by algorithm was “a very powerful tool in the hands of government,” he warned.
  • In contrast to the dark picture painted by many activists on copyright, multi-purpose filtering machines and the end of ownership in the age of the internet of things, chances for reform were presented for various areas of rights protection.
  • EU copyright reform itself is a chance, argued Raegan MacDonald from the Mozilla Foundation, calling it “the opportunity of a generation to bring copyright in line with the digital age, and we want to do that.” Yet the task, as in earlier copyright legislative processes, is once more to expose what she described as the later-dismantled myths of big rights holders: that any attempt to harmonise exceptions would kill their industry.