
Duty of care + Standards _ CU / Group items tagged "terrorist"


Carsten Ullrich

What Facebook isn't telling us about its fight against online abuse - Laura Bliss | Inf... - 0 views

  • In a six-month period from October 2017 to March 2018, 21m sexually explicit pictures, 3.5m graphically violent posts and 2.5m forms of hate speech were removed from its site. These figures help reveal some striking points.
  • As expected, the data indicates that the problem is getting worse.
    • Carsten Ullrich
       
      problem is getting worse - use as argument - look at facebook report
  • For instance, between January and March it was estimated that for every 10,000 messages online, between 22 and 27 contained graphic violence, up from 16 to 19 in the previous three months.
  • ...9 more annotations...
  • Here, the company has been proactive. Between January and March 2018, Facebook removed 1.9m messages encouraging terrorist propaganda, an increase of 800,000 comments compared to the previous three months. A total of 99.5% of these messages were located with the aid of advancing technology.
  • But Facebook hasn’t released figures showing how prevalent terrorist propaganda is on its site. So we really don’t know how successful the software is in this respect.
    • Carsten Ullrich
       
      we need data - this would be part of my demand for a standardized reporting system
  • on self-regulation,
  • Between the two three-month periods there was a 183% increase in the amount of posts removed that were labelled graphically violent. A total of 86% of these comments were flagged by a computer system.
  • But we also know that Facebook’s figures show that up to 27 out of every 10,000 comments that made it past the detection technology contained graphic violence.
  • One estimate suggests that 510,000 comments are posted every minute. If accurate, that would mean 1,982,880 violent comments are posted every 24 hours.
  • Facebook has also used technology to aid the removal of graphic violence from its site.
  • This brings us to the other significant figure not included in the data released by Facebook: the total number of comments reported by users. As this is a fundamental mechanism in tackling online abuse, the amount of reports made to the company should be made publicly available
  • However, even Facebook still has a long way to go to get to total transparency. Ideally, all social networking sites would release annual reports on how they are tackling abuse online. This would enable regulators and the public to hold the firms more directly to account for failures to remove online abuse from their servers.
    • Carsten Ullrich
       
      my demand - standardized reporting
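The daily-volume figure quoted in the article above can be checked with simple arithmetic (a sketch; the 510,000-comments-per-minute posting rate and the 27-per-10,000 violence rate are the article's own estimates, not verified figures):

```python
# Sanity check of the article's extrapolation.
per_minute = 510_000                 # estimated comments posted per minute
per_day = per_minute * 60 * 24       # comments posted per 24 hours
rate = 27 / 10_000                   # worst-case share containing graphic violence

violent_per_day = int(per_day * rate)
print(per_day)           # 734,400,000 comments per day
print(violent_per_day)   # 1,982,880 - matches the article's figure
```

So the article's 1,982,880 figure follows exactly from its two stated estimates.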
Carsten Ullrich

Article - 0 views

  • Internet Forum in 2015 in response to the alarming increase in the use of the Internet by terrorists to spread extremist propaganda
  • facilitation of a rapid and coordinated cross-border response mechanism to contain the spread of terrorist content online
  • Protocol only applies in exceptional situations, when national crisis management procedures prove insufficient.
Carsten Ullrich

The European Commission's Proposal for a Regulation on Preventing the Dissemination of ... - 0 views

  •  
    "10.30709/eucrim-2018-024"
Carsten Ullrich

The secret lives of Facebook moderators in America - The Verge - 0 views

  • It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.
  • The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”
  • The use of contract labor also has a practical benefit for Facebook: it is radically cheaper. The median Facebook employee earns $240,000 annually in salary, bonuses, and stock options. A content moderator working for Cognizant in Arizona, on the other hand, will earn just $28,800 per year. The arrangement helps Facebook maintain a high profit margin. In its most recent quarter, the company earned $6.9 billion in profits, on $16.9 billion in revenue. And while Zuckerberg had warned investors that Facebook’s investment in security would reduce the company’s profitability, profits were up 61 percent over the previous year.
  • ...3 more annotations...
  • Miguel takes a dim view of the accuracy figure. “Accuracy is only judged by agreement. If me and the auditor both allow the obvious sale of heroin, Cognizant was ‘correct,’ because we both agreed,” he says. “This number is fake.”
  • Even with an ever-changing rulebook, moderators are granted only the slimmest margins of error. The job resembles a high-stakes video game in which you start out with 100 points — a perfect accuracy score — and then scratch and claw to keep as many of those points as you can. Because once you fall below 95, your job is at risk. If a quality assurance manager marks Miguel’s decision wrong, he can appeal the decision. Getting the QA to agree with you is known as “getting the point back.” In the short term, an “error” is whatever a QA says it is, and so moderators have good reason to appeal every time they are marked wrong. (Recently, Cognizant made it even harder to get a point back, by requiring moderators to first get a SME to approve their appeal before it would be forwarded to the QA.)
  • Before Miguel can take a break, he clicks a browser extension to let Cognizant know he is leaving his desk. (“That’s a standard thing in this type of industry,” Facebook’s Davidson tells me. “To be able to track, so you know where your workforce is.”)
  •  
    "Pro Unlimited"
Carsten Ullrich

My Library - 0 views

  • that the elements which are relevant for assessing whether the proprietor of an EU trade mark is entitled to prohibit the use of a sign in part of the European Union not covered by that action, may be taken into account by that court
  • Although, for the purpose of assessing whether Ornua is entitled to prohibit the use of the sign KERRYMAID in Spain, the referring court should consider taking into account elements present in Ireland and the United Kingdom, it should first of all ensure that there is no significant difference between the market conditions or the sociocultural circumstances
  • ...4 more annotations...
  • In that regard, account should be taken, in particular, of the overall presentation of the product marketed by the third party, the circumstances in which a distinction is made between that mark and the sign used by that third party, and the effort made by that third party to ensure that consumers distinguish its products from those of which it is not the trade mark owner
  • in part of the European Union, an EU trade mark with a reputation and a sign peacefully coexist
  • It cannot be excluded that the conduct which can be expected of the third party so that its use of the sign follows honest practices in industrial or commercial matters must be analysed differently in a part of the European Union where consumers have a particular affinity with the geographical word contained in the mark and the sign at issue than in a part of the European Union where that affinity is weaker.
  • allows the conclusion that in another part of the European Union, where that peaceful coexistence is absent, there is due cause legitimising the use of that sign.
Carsten Ullrich

Article - 0 views

  • On 6 February 2020, the audiovisual regulator of the French-speaking community of Belgium (Conseil supérieur de l’audiovisuel – CSA) published a guidance note on the fight against certain forms of illegal Internet content, in particular hate speech
  • In the note, the CSA begins by summarising the current situation, highlighting the important role played by content-sharing platforms and their limited responsibility. It emphasises that some content can be harmful to young people in particular, whether they are the authors or victims of the content. It recognises that regulation, in its current form, is inappropriate and creates an imbalance between the regulation of online content-sharing platform operators, including social networks, and traditional players in the audiovisual sector
  • should take its own legislative measures without waiting for work to start on an EU directive on the subject.
  • ...6 more annotations...
  • if it advocates crimes against humanity; incites or advocates terrorist acts; or incites hatred, violence, discrimination or insults against a person or a group of people on grounds of origin, alleged race, religion, ethnic background, nationality, gender, sexual orientation, gender identity or disability, whether real or alleged.
  • obligations be imposed on the largest content-sharing platform operators, that is, any natural or legal person offering, on a professional basis, whether for remuneration or not, an online content-sharing platform, wherever it is based, used by at least 20% of the population of the French-speaking region of Belgium or the bilingual Brussels-Capital region.
  • obliged to remove or block content notified to them that is ‘clearly illegal’ within 24 hours.
  • need to put in place reporting procedures as well as processes for contesting their decisions
  • appoint an official contact person
  • half-yearly report on compliance with their obligation