The coming automatic, freaky, contextual world and why we're writing a book about it - ... - 0 views
A new world is coming. It's scary. Freaky. Over the freaky line, if you will. But it is coming. Investors like Ron Conway and Marc Andreessen are investing in it. Companies from Google to startups you've never heard of, like Wovyn or Highlight, are building it. And more than a couple of new ones are already on the way that you'll hear about over the next six months.
Do Google's 'unprofessional hair' results show it is racist? | Technology | The Guardian - 0 views
Google's Project Tango reveals location-aware phone | Technology | theguardian.com - 0 views
Mark Zuckerberg, Let Me Pay for Facebook - NYTimes.com - 0 views
"FACEBOOK. Instagram. Google. Twitter. All services we rely on - and all services we believe we don't have to pay for. Not with cash, anyway. But ad-financed Internet platforms aren't free, and the price they extract in terms of privacy and control is getting only costlier. A recent Pew Research Center poll shows that 93 percent of the public believes that "being in control of who can get information about them is important," and yet the amount of information we generate online has exploded and we seldom know where it all goes."
[1607.06520] Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Emb... - 0 views
The blind application of machine learning runs the risk of amplifying biases present in data. Such a danger is facing us with word embedding, a popular framework to represent text data as vectors which has been used in many machine learning and natural language processing tasks. We show that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent. This raises concerns because their widespread use, as we describe, often tends to amplify these biases. Geometrically, gender bias is first shown to be captured by a direction in the word embedding. Second, gender neutral words are shown to be linearly separable from gender definition words in the word embedding. Using these properties, we provide a methodology for modifying an embedding to remove gender stereotypes, such as the association between the words receptionist and female, while maintaining desired associations such as between the words queen and female. We define metrics to quantify both direct and indirect gender biases in embeddings, and develop algorithms to "debias" the embedding. Using crowd-worker evaluation as well as standard benchmarks, we empirically demonstrate that our algorithms significantly reduce gender bias in embeddings while preserving its useful properties such as the ability to cluster related concepts and to solve analogy tasks. The resulting embeddings can be used in applications without amplifying gender bias.
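The geometric idea in the abstract can be sketched in a few lines: find a gender direction from a definitional word pair, then project that direction out of a gender-neutral word's vector. This is a minimal illustration only; the toy 4-dimensional vectors and the single pair he/she are assumptions for demonstration (the paper derives the direction from several pairs via PCA, and real embeddings are hundreds of dimensions).

```python
import numpy as np

# Toy word vectors (hypothetical 4-d embeddings; real models use ~300-d vectors).
emb = {
    "he":           np.array([ 1.0, 0.2, 0.1, 0.0]),
    "she":          np.array([-1.0, 0.2, 0.1, 0.0]),
    "receptionist": np.array([-0.6, 0.5, 0.3, 0.1]),
}

# 1. Approximate the gender direction from a definitional pair.
g = emb["he"] - emb["she"]
g = g / np.linalg.norm(g)

def neutralize(v, direction):
    """Remove the component of v along the bias direction."""
    return v - np.dot(v, direction) * direction

# 2. Neutralize a gender-neutral occupation word: afterwards its
#    projection onto the gender axis is (numerically) zero.
debiased = neutralize(emb["receptionist"], g)
```

In the paper's terms this is the "neutralize" step of hard debiasing; definitional words like queen keep their gender component, while words like receptionist have theirs removed.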
Probing the Dark Side of Google's Ad-Targeting System - MIT Technology Review - 0 views
board.net - 0 views
Google's Nest div flings HALF an INSTAGRAM at Dropcam buyout * The Register - 1 views
Lawyers think ICO should have penalised Royal Free and DeepMind - Business Insider - 0 views