
GAVNet Collaborative Curation: Group items tagged "neural"


Bill Fulkerson

Anatomy of an AI System - 1 views

shared by Bill Fulkerson on 14 Sep 18
  • "With each interaction, Alexa is training to hear better, to interpret more precisely, to trigger actions that map to the user's commands more accurately, and to build a more complete model of their preferences, habits and desires. What is required to make this possible? Put simply: each small moment of convenience - be it answering a question, turning on a light, or playing a song - requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch. A full accounting for these costs is almost impossible, but it is increasingly important that we grasp the scale and scope if we are to understand and govern the technical infrastructures that thread through our lives. III The Salar, the world's largest flat surface, is located in southwest Bolivia at an altitude of 3,656 meters above sea level. It is a high plateau, covered by a few meters of salt crust which are exceptionally rich in lithium, containing 50% to 70% of the world's lithium reserves. 4 The Salar, alongside the neighboring Atacama regions in Chile and Argentina, are major sites for lithium extraction. This soft, silvery metal is currently used to power mobile connected devices, as a crucial material used for the production of lithium-ion batteries. It is known as 'grey gold.' Smartphone batteries, for example, usually have less than eight grams of this material. 5 Each Tesla car needs approximately seven kilograms of lithium for its battery pack. 6 All these batteries have a limited lifespan, and once consumed they are thrown away as waste. Amazon reminds users that they cannot open up and repair their Echo, because this will void the warranty. The Amazon Echo is wall-powered, and also has a mobile battery base. This also has a limited lifespan and then must be thrown away as waste. According to the Ay
Bill Fulkerson

Fooling Neural Networks in the Physical World with 3D Adversarial Objects · l... - 0 views

  • "We've developed an approach to generate 3D adversarial objects that reliably fool neural networks in the real world, no matter how the objects are looked at."
Steve Bosserman

Why we find change so difficult, according to neuroscience - 0 views

  • “Emotionally and cognitively and executively the brain has established a lot of pathways,” says Dr. Sanam Hafeez, a licensed clinical psychologist and neuropsychologist. “The more you do something the more ingrained it becomes in neural pathways, much like how a computer that stores the sites you visit — when you log onto your browser, they will pop up because you use them a lot. Change is an upheaval of many things and the brain has to work to fit it into an existing framework.”
  • “You absolutely can and should teach your brain to change,” says Hafeez, noting that keeping the brain agile has been shown to help delay aging. “I've done quite a bit of work on the aging process and slowing that down. It starts with changing the aversion to change.”
  • “Let’s say you’re a financial planner who takes up knitting,” says Hafeez. “That is doing something very different, where the brain truly has to adapt new neural pathways. Learning a new skill like this has been shown to ward off dementia, aging and cognitive decline because it regenerates cellular activity. Learn a new language in middle age. You tax your brain by shaking things up and it’s effective for your brain in the way HIIT is for your body.”
  • “Most people won't try something new because they’re deathly afraid of failing,” notes Hafeez. “When you see that something is doable it makes you more receptive and brave. There's that emotional, therapeutic factor that is separate from the neural pathway factor. Over the years, we learn to succeed by viewing our previous failures and successes in a certain light and as we get older we lose sight of that. When you try a new thing it makes you more confident to try to do more new things.”
Bill Fulkerson

Optimization is as hard as approximation - Machine Learning Research Blog - 0 views

  • Optimization is a key tool in machine learning, where the goal is to achieve the best possible objective function value in a minimum amount of time. Obtaining any form of global guarantees can usually be done with convex objective functions, or with special cases such as risk minimization with over-parameterized one-hidden-layer neural networks (see the June post). In this post, I will consider low-dimensional problems (imagine 10 or 20 dimensions), with no constraint on running time (thus get ready for running times that are exponential in dimension!).
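The post itself develops sharper tools, but the "exponential in dimension" caveat is easy to see with the most naive global method: exhaustive evaluation on a grid. The sketch below is a generic illustration, not code from the blog; the test objective, the dimension, and the grid resolution are made-up placeholders.

```python
# Brute-force global minimization on [0, 1]^d: a grid with n points per
# coordinate costs n**d evaluations, i.e. exponential in the dimension d.
import itertools
import numpy as np

def objective(x):
    # Stand-in nonconvex test function with a global minimum near x = 0.3.
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.cos(8 * np.pi * x)))

def grid_minimize(f, d, n):
    pts = np.linspace(0.0, 1.0, n)
    best_x, best_val = None, np.inf
    for idx in itertools.product(range(n), repeat=d):   # n**d candidates
        x = pts[list(idx)]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

x_star, f_star = grid_minimize(objective, d=3, n=25)    # 25**3 = 15,625 calls
print(x_star, f_star)
```

For an L-Lipschitz objective (with respect to the max norm), the best grid value is within roughly L / (2(n - 1)) of the global minimum, so halving the error doubles n in every coordinate and multiplies the cost by 2**d, which is why the excerpt restricts attention to around 10 or 20 dimensions.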
Steve Bosserman

60 Minutes: Facial and emotional recognition; how one man is advancing artificial intel... - 0 views

  • Basically chauffeurs, truck drivers, anyone who does driving for a living - their jobs will be disrupted more in the 15 to 20 year time frame, and many jobs that seem a little bit complex - chef, waiter - a lot of things will become automated. We'll have automated stores, automated restaurants, and all together, in 15 years, that's going to displace about 40 percent of the jobs in the world.
  • Because I believe in the sanctity of our soul. I believe there is a lot of things about us that we don't understand. I believe there's a lot of love and compassion that is not explainable in terms of neural networks and computation algorithms. And I currently see no way of solving them. Obviously, unsolved problems have been solved in the past. But it would be irresponsible for me to predict that these will be solved by a certain timeframe.
Steve Bosserman

Area of the brain that processes empathy identified - 0 views

  • According to Dr. Gu, this study provides the first evidence that the empathy deficits in patients with brain damage to the anterior insular cortex are surprisingly similar to the empathy deficits found in several psychiatric diseases, including autism spectrum disorders, borderline personality disorder, schizophrenia, and conduct disorders, which points to potentially common neural deficits in those psychiatric populations.
Steve Bosserman

How We Made AI As Racist and Sexist As Humans - 0 views

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans— are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them. (A minimal embedding sketch after this list illustrates the kind of correlation Hume describes.)
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure their diversity.
  • But not everyone will be equally represented in that data.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
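Kathryn Hume's point above, that these systems are "just math picking up on correlations" between gender and professions, is easy to reproduce with publicly available word vectors. The sketch below is a generic illustration, not code from the article: it assumes the gensim downloader and its public "glove-wiki-gigaword-100" vectors, and it uses a deliberately crude gender direction (the vector for "she" minus the vector for "he").

```python
# Minimal sketch: how strongly pretrained word embeddings associate
# profession words with a simple gender direction learned from text.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")   # public GloVe vectors

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Crude gender direction: positive scores lean toward "she", negative toward "he".
gender_direction = vectors["she"] - vectors["he"]

for job in ["nurse", "engineer", "homemaker", "programmer", "librarian"]:
    print(f"{job:>12s}: {cosine(vectors[job], gender_direction):+.3f}")
```

The scores come entirely from co-occurrence statistics in the training text, so any gendered split they show is exactly the kind of inherited pattern the article describes; it is measured here, not designed in.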