"It's not that Google wants to do this; it's that they didn't anticipate this outcome, and compounded that omission by likewise omitting a way to overrule the algorithm's judgment. As with other examples of algorithmic cruelty, it's not so much this specific example as what it presages for a future in which more and more of our external reality is determined by models derived from machine learning systems whose workings we're not privy to and have no say in."
"But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its "trending module" headlines - the list of news topics that shows up on the side of the browser window on Facebook's desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users' feeds."
"Uber knows when your phone battery is running low because its app collects that information in order to switch into power-saving mode. But Chen swears Uber would never use that knowledge to gouge you out of more money.
"We absolutely don't use that to kind of like push you a higher surge price, but it's an interesting kind of psychological fact of human behavior," Chen said.
Uber's surge pricing uses a proprietary algorithm that accounts for how many users are hailing rides in an area at a given time. Customers are apparently less willing to believe that explanation when the multiplier is a round number like 2.0 or 3.0, which seems more like it could have been arbitrarily made up by a human."
"Amazon's algorithms encourage customers to pay more than they need to for popular products and appear to give more prominence to items that benefit the retail giant, according to an investigation by ProPublica.
The investigation looked at 250 frequently purchased products over several weeks to see which ones were chosen to appear in the highly prized "buy box" that pops up first as a suggested purchase."
"To demonstrate how reality may differ for different Facebook users, The Wall Street Journal created two feeds, one "blue" and the other "red." If a source appears in the red feed, a majority of the articles shared from the source were classified as "very conservatively aligned" in a large 2015 Facebook study. For the blue feed, a majority of each source's articles were classified as "very liberally aligned." These aren't intended to resemble actual individual news feeds. Instead, they are rare side-by-side looks at real conversations from different perspectives."
"A simple Google image search, shared widely on Twitter, has been said to demonstrate the pervasiveness of racial bias and media profiling.
"Three black teenagers" was a trending search on Google on Thursday after a US high school student pointed out the stark difference in results for "three black teenagers" and "three white teenagers"."