
GAVNet Collaborative Curation: Group items tagged "loan"


Steve Bosserman

How We Made AI As Racist and Sexist As Humans

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans—are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • But not everyone will be equally represented in that data.
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure its diversity
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
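The mechanism the annotations above describe, "just math picking up on correlations" with no intentionality behind it, can be sketched in a few lines. Everything in this example is invented for illustration: the groups, approval rates, and the trivial per-group "model" are hypothetical, not drawn from any real lending system. The point is only that a model fit to biased historical decisions will faithfully reproduce the bias.

```python
import random

random.seed(0)

# Hypothetical "historical" loan dataset in which past human decisions
# approved group A at a higher rate than group B, even for applicants
# with the same underlying creditworthiness. (All rates are invented.)
def make_history(n=10000):
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        creditworthy = random.random() < 0.5
        if creditworthy:
            # Biased historical labeling: creditworthy group-B applicants
            # were approved less often than creditworthy group-A applicants.
            approved = random.random() < (0.9 if group == "A" else 0.6)
        else:
            approved = random.random() < 0.1
        rows.append((group, creditworthy, approved))
    return rows

history = make_history()

# A naive "model" that simply learns the per-group approval rate from the
# historical data -- pure correlation, no intent -- inherits the gap.
def approval_rate(rows, group):
    g = [r for r in rows if r[0] == group]
    return sum(1 for r in g if r[2]) / len(g)

rate_a = approval_rate(history, "A")
rate_b = approval_rate(history, "B")
print(f"learned approval rate, group A: {rate_a:.2f}")
print(f"learned approval rate, group B: {rate_b:.2f}")
```

The model never sees the group labels as "sensitive"; it only sees that approval correlates with group membership in the training data, which is exactly the tension Hume describes.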
Steve Bosserman

Wanted: Factory Workers, Degree Required - The New York Times

  • Struggling to fill jobs in the Charlotte plant, Siemens in 2011 created an apprenticeship program for seniors at local high schools that combines four years of on-the-job training with an associate degree in mechatronics from nearby Central Piedmont Community College. When they finish, graduates have no student loans and earn more than $50,000 a year.
Bill Fulkerson

Fed's massive 'Main Street' business rescue in danger of fizzling - POLITICO

  • Under the program, expected to be rolled out this week, companies will also face unwelcome curbs on stock buybacks, dividend payments and executive pay. And the sheer length of time it has taken to start the program (two months) has already forced many firms to seek alternatives. That has left industries divided, with manufacturers eager to tap the loans but retailers wanting more, as many businesses face the prospect of extensive layoffs or even bankruptcy.
Bill Fulkerson

Gender imbalanced datasets may affect the performance of AI pathology classifi...

  • Though it may not be common knowledge, AI systems are currently used in a wide variety of commercial applications, including article selection on news and social media sites, decisions about which movies get made, and the maps that appear on our phones; AI systems have become trusted tools for big business. But their use has not always been without controversy. In recent years, researchers have found that AI apps used to approve mortgage and other loan applications are biased, for example, in favor of white males. This, researchers found, was because the dataset used to train the system mostly comprised white male profiles. In this new effort, the researchers wondered if the same might be true for AI systems used to assist doctors in diagnosing patients.
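A minimal sketch of how an imbalanced training set can degrade performance for the underrepresented group. All distributions, group labels, and numbers below are hypothetical, chosen only to make the effect visible: the informative feature is shifted for the minority group, so a single classifier fit to average accuracy on a 90/10 dataset ends up tuned to the majority.

```python
import random

random.seed(1)

def sample(group, n):
    # Hypothetical data: the feature distribution is shifted for group "F",
    # so the decision threshold that works best differs between groups.
    shift = 2.0 if group == "F" else 0.0
    data = []
    for _ in range(n):
        y = random.random() < 0.5
        x = (1.0 if y else -1.0) + shift + random.gauss(0, 0.3)
        data.append((x, y))
    return data

# Imbalanced training set: 90% group M, 10% group F.
train = sample("M", 9000) + sample("F", 1000)

def accuracy(data, t):
    # Fraction of examples a threshold classifier "positive if x > t" gets right.
    return sum((x > t) == y for x, y in data) / len(data)

# Fit a single threshold by brute force -- a stand-in for any classifier
# that optimizes average accuracy over the whole training set.
best_t = max((t / 10 for t in range(-20, 30)), key=lambda t: accuracy(train, t))

test_m = sample("M", 5000)
test_f = sample("F", 5000)
print(f"threshold: {best_t:.1f}")
print(f"accuracy on group M: {accuracy(test_m, best_t):.2f}")
print(f"accuracy on group F: {accuracy(test_f, best_t):.2f}")
```

Because group M dominates the training objective, the learned threshold sits where M is separated well, and accuracy on group F drops sharply, the same pattern the pathology study probes with gender-imbalanced training data.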