ITGSonline: group items tagged "policiesandstandards algorithm"

dr tech

Ethics committee raises alarm over 'predictive policing' tool | UK news | The Guardian

  •  "Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics, despite warnings from campaigners that use of algorithms and "predictive policing" models risks locking discrimination into the criminal justice system."
dr tech

The EU's plan for algorithmic copyright filters is looking more and more unlikely / Boi...

  •  "Under the proposal, online platforms would have to spend hundreds of millions of euros on algorithmic copyright filters that would compare everything users tried to post with a database of supposedly copyrighted works, which anyone could add anything to, and block any suspected matches. This would snuff out all the small EU competitors to America's Big Tech giants, and put all Europeans' communications under threat of arbitrary censorship by balky, unaccountable, easily abused algorithms."
dr tech

The airline Ryanair uses algorithms to split up families, study indicates - Vox

  •  "In a survey of more than 4,200 people conducted by CAA, travelers most frequently cited being split from their party while traveling on Ryanair, but the airline insists that it doesn't employ a family-splitting algorithm. Ryanair says if a person doesn't pay for their seat assignment, they are "randomly" assigned, which may result in them not sitting with their party."
dr tech

The terrifying, hidden reality of Ridiculously Complicated Algorithms

  •  ""Weapons of math destruction" is how the writer Cathy O'Neil describes the nasty and pernicious kinds of algorithms that are not subject to the same challenges that human decision-makers are. Parole algorithms (not Jure's) can bias decisions on the basis of income or (indirectly) ethnicity. Recruitment algorithms can reject candidates on the basis of mistaken identity. In some circumstances, such as policing, they might create feedback loops, sending police into areas with more crime, which causes more crime to be detected."
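The feedback loop the excerpt describes can be sketched in a few lines. This is a hypothetical toy model (not any real predictive-policing system): two districts have identical true crime rates, but patrols are allocated in proportion to past detections, and detections scale with patrol presence, so an arbitrary skew in the historical data never washes out.

```python
def patrol_feedback(rounds=50):
    """Toy feedback-loop model: patrols follow past detections,
    and detections follow patrols."""
    true_rate = [100.0, 100.0]   # both districts have the SAME underlying crime
    detected = [55.0, 45.0]      # historical data starts slightly skewed
    for _ in range(rounds):
        total = detected[0] + detected[1]
        # allocate patrol share in proportion to past detections
        share = [d / total for d in detected]
        # detections are proportional to patrol presence times true crime
        detected = [true_rate[i] * share[i] for i in range(2)]
    return share

share = patrol_feedback()
```

Even after many rounds, district 0 still receives 55% of patrols despite having no more crime than district 1: the model locks in the initial skew rather than correcting it, and any mechanism that boosts detection further in heavily patrolled areas would widen the gap instead of merely preserving it.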
dr tech

Blue Feed, Red Feed - WSJ.com

  •  "To demonstrate how reality may differ for different Facebook users, The Wall Street Journal created two feeds, one "blue" and the other "red." If a source appears in the red feed, a majority of the articles shared from the source were classified as "very conservatively aligned" in a large 2015 Facebook study. For the blue feed, a majority of each source's articles aligned "very liberal." These aren't intended to resemble actual individual news feeds. Instead, they are rare side-by-side looks at real conversations from different perspectives."
dr tech

'Three black teenagers': anger as Google image search shows police mugshots | Technolog...

  •  "A simple Google image search highlighted on Twitter has been said to highlight the pervasiveness of racial bias and media profiling. "Three black teenagers" was a trending search on Google on Thursday after a US high school student pointed out the stark difference in results for "three black teenagers" and "three white teenagers"."
dr tech

Uber knows you're more likely to pay surge prices when your phone is dying

  •  "Uber knows when your phone battery is running low because its app collects that information in order to switch into power-saving mode. But Chen swears Uber would never use that knowledge to gouge you out of more money. "We absolutely don't use that to kind of like push you a higher surge price, but it's an interesting kind of psychological fact of human behavior," Chen said. Uber's surge pricing uses a proprietary algorithm that accounts for how many users are hailing rides in an area at a given time. Customers are apparently less willing to believe that when the multiplier is a round number like 2.0 or 3.0, which seems more like it could have been arbitrarily made up by a human."
dr tech

Algorithmic cruelty: when Gmail adds your harasser to your speed-dial / Boing Boing

  •  "It's not that Google wants to do this, it's that they didn't anticipate this outcome, and compounded that omission by likewise omitting a way to overrule the algorithm's judgment. As with other examples of algorithmic cruelty, it's not so much this specific example as what it presages for a future in which more and more of our external reality is determined by models derived from machine learning systems whose workings we're not privy to and have no say in."
1 - 9 of 9