
UTS-AEI: Group items tagged machine-learning

Data journalism's AI opportunity: the 3 different types of machine learning & how they ...

  •  Some examples of how the 3 types of machine learning - supervised, unsupervised, and reinforcement - have already been used for journalistic purposes, using those examples to explain what each type is along the way. Examples include: supervised learning to investigate doctors and sex abuse; unsupervised learning to identify motifs in Wes Anderson films; reinforcement learning to create a rock-paper-scissors bot that can beat you...
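
A rough sketch of what those three types look like in code (not the journalistic projects themselves): scikit-learn stands in for the supervised and unsupervised cases, and a tiny hand-rolled bandit stands in for reinforcement learning against a biased rock-paper-scissors opponent. All data here is made up.

```python
# Illustrative sketch of the three flavours of machine learning.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# 1. Supervised learning: labelled examples -> learn to predict the label.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)          # known answers
clf = LogisticRegression().fit(X, y)
print("supervised accuracy:", clf.score(X, y))

# 2. Unsupervised learning: no labels -> find structure (here, clusters).
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(clusters))

# 3. Reinforcement learning: act, observe a reward, adjust. A bandit-style
#    agent learns which rock-paper-scissors move wins against a biased opponent.
values = np.zeros(3)                              # estimated value of each move (rock, paper, scissors)
counts = np.zeros(3)
opponent_bias = [0.6, 0.2, 0.2]                   # opponent overplays "rock"
for _ in range(2000):
    move = rng.integers(3) if rng.random() < 0.1 else int(values.argmax())
    opp = rng.choice(3, p=opponent_bias)
    reward = 1.0 if (move - opp) % 3 == 1 else 0.0  # paper beats rock, etc.
    counts[move] += 1
    values[move] += (reward - values[move]) / counts[move]
print("learned move preferences:", values.round(2))  # should end up favouring "paper"
```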

The way we train AI is fundamentally flawed - MIT Technology Review

  •  Roughly put, building a machine-learning model involves training it on a large number of examples and then testing it on a bunch of similar examples that it has not yet seen. When the model passes the test, you're done. What the Google researchers point out is that this bar is too low. The training process can produce many different models that all pass the test but - and this is the crucial part - these models will differ in small, arbitrary ways, depending on things like the random values given to the nodes in a neural network before training starts, the way training data is selected or represented, the number of training runs, and so on. These small, often random, differences are typically overlooked if they don't affect how a model does on the test. But it turns out they can lead to huge variation in performance in the real world. In other words, the process used to build most machine-learning models today cannot tell which models will work in the real world and which ones won't.
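
A toy sketch of that "underspecification" argument, on invented data: five networks differing only in their random seed score the same on the held-out test set, yet can diverge once the inputs shift in a way the test never probes.

```python
# Sketch of underspecification: models that look identical on the test set
# can behave differently off it. Toy data; only the random seed varies.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.1 * X[:, 1] > 0).astype(int)     # feature 0 carries most of the signal
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Deployment" data with a mild shift the test set never probes.
X_shifted = X_test.copy()
X_shifted[:, 1] += 3.0

models = [
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=seed).fit(X_train, y_train)
    for seed in range(5)                           # only the random initialisation differs
]
for i, m in enumerate(models):
    print(f"seed {i}: test acc={m.score(X_test, y_test):.3f}, "
          f"shifted acc={m.score(X_shifted, y_test):.3f}")
# Test accuracies cluster tightly; shifted accuracies typically spread out,
# because the test never distinguished how each model used feature 1.
```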

Opinion | The Legislation That Targets the Racist Impacts of Tech - The New York Times

  •  When creating a machine-learning algorithm, designers have to make many choices: what data to train it on, what specific questions to ask, how to use predictions that the algorithm produces. These choices leave room for discrimination, particularly against people who have been discriminated against in the past. For example, training an algorithm to select potential medical students on a data set that reflects longtime biases against women and people of color may make these groups less likely to be admitted. In computing, the phrase "garbage in, garbage out" describes how poor-quality input leads to poor-quality output. In this case we might say, "White male doctors in, white male doctors out."
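
A hypothetical illustration of that point: if historical admission decisions applied a higher bar to one group, a model trained on those decisions learns to do the same, even for applicants with identical scores. The data and threshold below are invented.

```python
# Hypothetical illustration of biased labels being learned back: historical
# decisions penalise group B, and a model trained on them does the same.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
group = rng.integers(2, size=n)                   # 0 = group A, 1 = group B
scores = rng.normal(size=n)                       # merit is distributed identically

# Historical admit decisions: same scores, but group B needed a higher bar.
admitted = (scores > np.where(group == 1, 0.8, 0.0)).astype(int)

X = np.column_stack([scores, group])
model = LogisticRegression().fit(X, admitted)

# Two applicants with identical scores, different group membership.
same_score = np.array([[0.5, 0], [0.5, 1]])
print("P(admit | group A), P(admit | group B):",
      model.predict_proba(same_score)[:, 1].round(2))
# The model gives the group-B applicant a lower probability: the historical
# bias is now baked into the predictions.
```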

A Dataset is a Worldview - Towards Data Science

  •  But because a machine learning model learns the boundaries of its world from its input data, just three people informed how any model using that dataset would interpret whether 'childbirth' was emotional. This led to a perspective that has informed all of my work since: a dataset is a worldview. It encompasses the worldview of the people who scrape and collect the data, whether they're researchers, artists, or companies. It encompasses the worldview of the labelers, whether they labeled the data manually, unknowingly, or through a third-party service like Mechanical Turk, which comes with its own demographic biases. It encompasses the worldview of the inherent taxonomies created by the organizers, which in many cases are corporations whose motives are directly incompatible with a high quality of life.

Opinion | We Built an 'Unbelievable' (but Legal) Facial Recognition Machine - The New York Times

  •  Most people pass through some type of public space in their daily routine - sidewalks, roads, train stations. Thousands walk through Bryant Park every day. But we generally think that a detailed log of our location, and a list of the people we're with, is private. Facial recognition, applied to the web of cameras that already exists in most cities, is a threat to that privacy. To demonstrate how easy it is to track people without their knowledge, we collected public images of people who worked near Bryant Park (available on their employers' websites, for the most part) and ran one day of footage through Amazon's commercial facial recognition service.
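
A hedged sketch of the kind of pipeline the piece describes, assuming Amazon Rekognition via boto3; the folder names, files, and similarity threshold are illustrative, not the authors' actual setup.

```python
# Hedged sketch (not the authors' actual code): match known staff photos
# against still frames pulled from public camera footage, using Amazon
# Rekognition through boto3. File paths and threshold are illustrative.
import glob
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

def load(path):
    with open(path, "rb") as f:
        return f.read()

staff_photos = glob.glob("staff_photos/*.jpg")    # images collected from public pages
frames = glob.glob("park_frames/*.jpg")           # stills extracted from the footage

for photo in staff_photos:
    source = load(photo)
    for frame in frames:
        resp = rekognition.compare_faces(
            SourceImage={"Bytes": source},
            TargetImage={"Bytes": load(frame)},
            SimilarityThreshold=90,               # only report confident matches
        )
        for match in resp["FaceMatches"]:
            print(f"{photo} appears in {frame} "
                  f"(similarity {match['Similarity']:.1f}%)")
```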

Breaking the Black Box: What Facebook Knows About You - ProPublica

  •  A series of short articles, with videos and browser add-ons, "investigating algorithmic injustice and the formulas that influence our lives."

How marketers use algorithms to (try to) read your mind

  •  Have you ever looked for a product online and then been recommended the exact thing you need to complement it? Or have you been thinking about a particular purchase, only to receive an email with that product on sale? All of this may give you a slightly spooky feeling, but what you're really experiencing is the result of complex algorithms used to predict, and in some cases, even influence your behaviour.
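
One simple mechanism behind those "bought together" suggestions is item-to-item co-occurrence counting over past purchase baskets. A toy sketch, with made-up baskets and products:

```python
# Toy item-to-item recommender: products that are often bought together with
# what you just viewed get recommended. Baskets and products are made up.
from collections import Counter, defaultdict

baskets = [
    {"phone", "phone case", "screen protector"},
    {"phone", "phone case"},
    {"laptop", "laptop sleeve", "mouse"},
    {"phone", "screen protector"},
    {"laptop", "mouse"},
]

co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def recommend(item, k=2):
    """Items most frequently bought alongside `item`."""
    return [other for other, _ in co_counts[item].most_common(k)]

print(recommend("phone"))    # e.g. ['phone case', 'screen protector']
print(recommend("laptop"))   # e.g. ['mouse', 'laptop sleeve']
```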

Do computers make better bank managers than humans?

  •  Algorithms are increasingly making decisions that affect ordinary people's lives. One example of this is so-called "algorithmic lending", with some companies claiming to have reduced the time it takes to approve a home loan to mere minutes. But can computers become better judges of financial risk than human bank tellers? Some computer scientists and data analysts certainly think so.
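
A minimal sketch of what "algorithmic lending" can look like under the hood: a classifier trained on past loan outcomes scores a new applicant's default risk, and a cut-off turns that score into a decision. The features, figures, and cut-off below are invented.

```python
# Minimal sketch of algorithmic lending: score a new applicant's default risk
# from past loan outcomes. All features and figures are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 2000
income = rng.normal(70_000, 20_000, n)            # annual income
debt_ratio = rng.uniform(0, 0.8, n)               # existing debt / income
# Past outcomes: higher debt ratio and lower income meant more defaults.
p_default = 1 / (1 + np.exp(-(4 * debt_ratio - income / 40_000)))
defaulted = rng.random(n) < p_default

X = np.column_stack([income, debt_ratio])
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, defaulted)

applicant = np.array([[85_000, 0.25]])            # income, debt ratio
risk = model.predict_proba(applicant)[0, 1]
print(f"estimated default risk: {risk:.1%}")
print("approve" if risk < 0.2 else "refer to a human")   # illustrative cut-off
```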

Design of Hiring Algorithms Impacts Diversity | IndustryWeek

  •  The use of historical data to train the AI gives 'a leg-up to people from groups who have traditionally been successful and grants fewer opportunities to minorities and women'.
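
A short synthetic sketch of how that plays out, plus one common way it is checked: train a screening model on historical hires that favoured one group, then compare selection rates per group on a fresh, equally skilled pool (an impact ratio below 0.8 is the usual red flag). All data here is synthetic.

```python
# Synthetic check of how a screening model trained on biased historical hires
# can disadvantage one group: compare selection rates per group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 4000
group = rng.integers(2, size=n)                   # 0 = historically favoured, 1 = not
skill = rng.normal(size=n)                        # equally skilled on average

# Historical hiring: same skill, but group 1 faced a higher bar.
hired = (skill > np.where(group == 1, 1.0, 0.0)).astype(int)

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Screen a fresh, equally skilled applicant pool with the trained model.
new_group = rng.integers(2, size=n)
new_skill = rng.normal(size=n)
selected = model.predict(np.column_stack([new_skill, new_group]))

rate_0 = selected[new_group == 0].mean()
rate_1 = selected[new_group == 1].mean()
print(f"selection rate, group 0: {rate_0:.2f}")
print(f"selection rate, group 1: {rate_1:.2f}")
print(f"impact ratio: {rate_1 / rate_0:.2f}  (below 0.8 is the usual red flag)")
```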