
GAVNet Collaborative Curation: Group items tagged "blindness"


Bill Fulkerson

Why a 400-Year Program of Modernist Thinking is Exploding | naked capitalism - 0 views

  •  
    " Fearless commentary on finance, economics, politics and power Follow yvessmith on Twitter Feedburner RSS Feed RSS Feed for Comments Subscribe via Email SUBSCRIBE Recent Items Links 3/11/17 - 03/11/2017 - Yves Smith Deutsche Bank Tries to Stay Alive - 03/11/2017 - Yves Smith John Helmer: Australian Government Trips Up Ukrainian Court Claim of MH17 as Terrorism - 03/11/2017 - Yves Smith 2:00PM Water Cooler 3/10/2017 - 03/10/2017 - Lambert Strether Why a 400-Year Program of Modernist Thinking is Exploding - 03/10/2017 - Yves Smith Links 3/10/17 - 03/10/2017 - Yves Smith Why It Will Take a Lot More Than a Smartphone to Get the Sharing Economy Started - 03/10/2017 - Yves Smith CalPERS' General Counsel Railroads Board on Fiduciary Counsel Selection - 03/10/2017 - Yves Smith Another Somalian Famine - 03/10/2017 - Yves Smith Trade now with TradeStation - Highest rated for frequent traders Why a 400-Year Program of Modernist Thinking is Exploding Posted on March 10, 2017 by Yves Smith By Lynn Parramore, Senior Research Analyst at the Institute for New Economic Thinking. Originally published at the Institute for New Economic Thinking website Across the globe, a collective freak-out spanning the whole political system is picking up steam with every new "surprise" election, rush of tormented souls across borders, and tweet from the star of America's great unreality show, Donald Trump. But what exactly is the force that seems to be pushing us towards Armageddon? Is it capitalism gone wild? Globalization? Political corruption? Techno-nightmares? Rajani Kanth, a political economist, social thinker, and poet, goes beyond any of these explanations for the answer. In his view, what's throwing most of us off kilter - whether we think of ourselves as on the left or right, capitalist or socialist -was birthed 400 years ago during the period of the Enlightenment. It's a set of assumptions, a particular way of looking at the world that pushed out previous modes o
Bill Fulkerson

Datafication and ideological blindness - Cennydd Bowles - 0 views

  •  
    ""Our bodies break / And the blood just spills and spills / And here we sit debating math." -Retribution Gospel Choir, Breaker Design got its seat at the table, which is good because we can shut up about it now. What used to be seen as the territory of bespectacled Scandinavians is now a matter of HBR covers, consumer clamour, and 12-figure market caps. People in suits now talk about design as a way to differentiate products and unlock new markets."
Bill Fulkerson

Anatomy of an AI System - 1 views

shared by Bill Fulkerson on 14 Sep 18
  •  
    "With each interaction, Alexa is training to hear better, to interpret more precisely, to trigger actions that map to the user's commands more accurately, and to build a more complete model of their preferences, habits and desires. What is required to make this possible? Put simply: each small moment of convenience - be it answering a question, turning on a light, or playing a song - requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch. A full accounting for these costs is almost impossible, but it is increasingly important that we grasp the scale and scope if we are to understand and govern the technical infrastructures that thread through our lives. III The Salar, the world's largest flat surface, is located in southwest Bolivia at an altitude of 3,656 meters above sea level. It is a high plateau, covered by a few meters of salt crust which are exceptionally rich in lithium, containing 50% to 70% of the world's lithium reserves. 4 The Salar, alongside the neighboring Atacama regions in Chile and Argentina, are major sites for lithium extraction. This soft, silvery metal is currently used to power mobile connected devices, as a crucial material used for the production of lithium-Ion batteries. It is known as 'grey gold.' Smartphone batteries, for example, usually have less than eight grams of this material. 5 Each Tesla car needs approximately seven kilograms of lithium for its battery pack. 6 All these batteries have a limited lifespan, and once consumed they are thrown away as waste. Amazon reminds users that they cannot open up and repair their Echo, because this will void the warranty. The Amazon Echo is wall-powered, and also has a mobile battery base. This also has a limited lifespan and then must be thrown away as waste. According to the Ay
Bill Fulkerson

Cancel culture: the road to obscurantism - 0 views

  •  
    In her view, Ancient Greece's blind master storyteller, Homer, and his works, were guilty of "indulging and spreading sexism, racism, ableism, and Western-centrism". She came to the conclusion that canceling the classics seemed to be the most effective way to make sure that today's young generation could not, again in her view, be poisoned by the entirely fictional and mythical "sins" of Odysseus, Menelaus, and Priam.
Bill Fulkerson

Alarming COVID variants show vital role of genomic surveillance - 0 views

  •  
    Efforts to track SARS-CoV-2 sequences have helped identify worrying variants - but researchers are blind to emerging mutations in some regions.
Steve Bosserman

Applying AI for social good | McKinsey - 0 views

  • Artificial intelligence (AI) has the potential to help tackle some of the world’s most challenging social problems. To analyze potential applications for social good, we compiled a library of about 160 AI social-impact use cases. They suggest that existing capabilities could contribute to tackling cases across all 17 of the UN’s sustainable-development goals, potentially helping hundreds of millions of people in both advanced and emerging countries. Real-life examples of AI are already being applied in about one-third of these use cases, albeit in relatively small tests. They range from diagnosing cancer to helping blind people navigate their surroundings, identifying victims of online sexual exploitation, and aiding disaster-relief efforts (such as the flooding that followed Hurricane Harvey in 2017). AI is only part of a much broader tool kit of measures that can be used to tackle societal issues, however. For now, issues such as data accessibility and shortages of AI talent constrain its application for social good.
  • The United Nations’ Sustainable Development Goals (SDGs) are among the best-known and most frequently cited societal challenges, and our use cases map to all 17 of the goals, supporting some aspect of each one (Exhibit 3). Our use-case library does not rest on the taxonomy of the SDGs, because their goals, unlike ours, are not directly related to AI usage; about 20 cases in our library do not map to the SDGs at all. The chart should not be read as a comprehensive evaluation of AI’s potential for each SDG; if an SDG has a low number of cases, that reflects our library rather than AI’s applicability to that SDG.
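The mapping exercise described in the excerpt above (a library of use cases, each tagged with zero or more SDGs, plus coverage counts) is straightforward to represent. The sketch below is a minimal illustration of that structure, not McKinsey's actual library or taxonomy; the example use cases and SDG assignments are hypothetical.

# Hypothetical sketch of a use-case library tagged with UN SDG numbers (1-17).
# Entries are illustrative only and are not taken from the McKinsey library.
from collections import Counter

use_cases = [
    {"name": "cancer-screening triage",              "sdgs": [3]},       # health
    {"name": "navigation aid for blind users",       "sdgs": [3, 10]},   # health, reduced inequalities
    {"name": "flood-damage mapping after storms",    "sdgs": [11, 13]},  # cities, climate
    {"name": "detecting online sexual exploitation", "sdgs": [16]},      # peace and justice
    {"name": "internal research tooling",            "sdgs": []},        # maps to no SDG
]

# Coverage counts of the kind the exhibit described above is built from.
per_sdg = Counter(sdg for case in use_cases for sdg in case["sdgs"])
unmapped = sum(1 for case in use_cases if not case["sdgs"])

print("use cases per SDG:", dict(sorted(per_sdg.items())))
print("use cases mapping to no SDG:", unmapped)

The analogous counts over the real library of roughly 160 cases are what the excerpt's figures refer to: some coverage of all 17 goals, and about 20 cases that map to none.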
Steve Bosserman

How We Made AI As Racist and Sexist As Humans - 0 views

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans— are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • But not everyone will be equally represented in that data.
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure its diversity
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
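The annotations above describe the mechanism in prose: a model trained on historically skewed data picks up the correlation between a sensitive attribute (or a proxy for it) and past outcomes, and the simplest "algorithmic solutions" begin by measuring the resulting disparity. The sketch below, using NumPy and scikit-learn, is a minimal synthetic illustration of that loop; it is not from the article, and every feature name, group label, and number in it is hypothetical.

# Minimal, synthetic illustration: a classifier trained on biased historical
# decisions reproduces the disparity, even without seeing the group label.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

group = rng.integers(0, 2, size=n)           # sensitive attribute: 0 = group A, 1 = group B
skill = rng.normal(0.0, 1.0, size=n)         # the quality we would actually like to select on

# Historical decisions were biased: group B needed a higher skill to be approved.
historical_approval = (skill - 0.8 * group + rng.normal(0.0, 0.5, size=n)) > 0

# The model never sees "group" directly, but a correlated proxy feature leaks it
# (think zip code or school attended).
proxy = group + rng.normal(0.0, 0.3, size=n)
X = np.column_stack([skill, proxy])

model = LogisticRegression().fit(X, historical_approval)
predicted = model.predict(X)

# Simple demographic-parity check: compare predicted approval rates per group.
rate_a = predicted[group == 0].mean()
rate_b = predicted[group == 1].mean()
print(f"approval rate, group A: {rate_a:.2f}")
print(f"approval rate, group B: {rate_b:.2f}")
print(f"gap (A - B):            {rate_a - rate_b:.2f}")  # nonzero: the historical bias was learned

Even though the group label is never handed to the model, the proxy feature carries it in, and the selection-rate gap the script prints is the same demographic-parity style check that much of the algorithmic-fairness literature starts from.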