
GAVNet Collaborative Curation: Group items tagged "data mining"


Bill Fulkerson

Anatomy of an AI System - 1 views

shared by Bill Fulkerson on 14 Sep 18
  • "With each interaction, Alexa is training to hear better, to interpret more precisely, to trigger actions that map to the user's commands more accurately, and to build a more complete model of their preferences, habits and desires. What is required to make this possible? Put simply: each small moment of convenience - be it answering a question, turning on a light, or playing a song - requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch. A full accounting for these costs is almost impossible, but it is increasingly important that we grasp the scale and scope if we are to understand and govern the technical infrastructures that thread through our lives.

    The Salar, the world's largest flat surface, is located in southwest Bolivia at an altitude of 3,656 meters above sea level. It is a high plateau, covered by a few meters of salt crust that is exceptionally rich in lithium, containing 50% to 70% of the world's lithium reserves. The Salar and the neighboring Atacama regions in Chile and Argentina are major sites for lithium extraction. This soft, silvery metal is currently used to power mobile connected devices, as a crucial material in the production of lithium-ion batteries. It is known as 'grey gold.' Smartphone batteries, for example, usually contain less than eight grams of this material. Each Tesla car needs approximately seven kilograms of lithium for its battery pack. All these batteries have a limited lifespan, and once consumed they are thrown away as waste. Amazon reminds users that they cannot open up and repair their Echo, because this will void the warranty. The Amazon Echo is wall-powered, and also has a mobile battery base. This also has a limited lifespan and then must be thrown away as waste. According to the Ay
Steve Bosserman

The wealth of our collective data should belong to all of us | Chris Hughes - 0 views

  • Nearly every moment of our lives, we’re producing data about ourselves that companies profit from. Our smartwatches know when we wake up, Alexa listens to our private conversations, our phones track where we go, Google knows what we email and search, Facebook knows what we share with friends, and our loyalty cards remember what we buy. We share all this data about ourselves because we like the services these companies provide, and business leaders tell us we must in order to make it possible for those services to be cheap or free.
  • We should not only expect that these companies better protect our data – we should also ensure that everyone creating it shares in the economic value it generates. One person’s data is worth little, but the collection of lots of people’s data is what fuels the insights that companies use to make more money, and the networks, like Facebook, that marketers are so attracted to. Data isn’t the “new oil”, as some have claimed: it isn’t a non-renewable natural resource that comes from a piece of earth that a lucky property owner controls. We have all pitched in to create a new commonwealth of information about ourselves that is bigger than any single participant, and we should all benefit from it.
  • The value of our data has a lot in common with the value of our labor: a single individual worker, outside of the rarest professions, can be replaced by another with similar skills. But when workers organize to withhold their labor, they have much more power to ensure employers more fairly value it. Just as one worker is an island but organized workers are a force to be reckoned with, the users of digital platforms should organize not only for better protection of our data, but for a new contract that ensures everyone shares in the historic profits we make possible.
  • A data dividend would be a powerful way to rebalance the American economy, which currently makes it possible for a very small number of people to get rich while everyone else struggles to make ends meet.
  • A data dividend on its own would not be enough to stem growing income inequality, but it would create a universal benefit that would guarantee people benefit from the collective wealth our economy is creating more than they do today. If paired with fairer wages, more progressive taxation, and stricter enforcement of monopoly and monopsony power, it could help us turn the corner and create a country where we take care of one another and ensure that everyone has basic economic security.
Steve Bosserman

How We Made AI As Racist and Sexist As Humans - 0 views

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans— are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure its diversity
  • But not everyone will be equally represented in that data.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
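Hume's point above - that "it's just math picking up on correlations" - can be made concrete with a toy sketch. Everything below (the mini-corpus, the function names) is invented for illustration and is not taken from any system the article describes; counting within-sentence co-occurrence in a skewed corpus is already enough for a model to "learn" a gender-profession association:

```python
# Invented mini-corpus with a deliberately skewed gender/profession pattern.
sentences = [
    "she is a nurse", "he is an engineer",
    "she works as a nurse", "he works as an engineer",
    "she is a nurse too", "he is a doctor",
]

def cooccurs(word, context):
    """Count sentences in which `word` and `context` appear together."""
    return sum(1 for s in sentences
               if word in s.split() and context in s.split())

# The statistics faithfully mirror the skew in the data:
print(cooccurs("nurse", "she"), cooccurs("nurse", "he"))  # prints "3 0"
```

Nothing in the arithmetic "knows" that gender is a sensitive attribute; a real embedding model trained on text with the same skew would encode the same association at scale - which is exactly why "get better data" is offered as the fundamental fix.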
Bill Fulkerson

Re-assembling the surveillable refugee body in the era of data-craving - 0 views

  • This article traces the travel of biometric data of Syrian refugees in Jordan through a hastily evolving political economy characterized by a pervasive craving for the extraction, storage and brokering of displacement data. It analyzes iris-enrollment as problematic acts of quasi-citizenship for the displaced requiring the performance of social and economic docility in order to attain identity, cash and service provision. Quasi-objects in the form of digital footprints are fashioned through infrastructures that simultaneously seek to model, yet fail to capture, socioeconomic existence in displacement contexts. Discourses of anti-fraud, donor dictates, upward accountability and strategies of financial inclusion of 'the unbanked', facilitate the marketization of the creation of data-doubles in laboratories of displacement and loopholes for externalization. Driven by increasingly blurred lines between technological, humanitarian and financial interests, this development has transformative effects on both those displaced, and on a humanitarian sector tasked with safeguarding their rights.
Steve Bosserman

'Forget the Facebook leak': China is mining data directly from workers' brains on an in... - 0 views

  • The technology is in widespread use around the world but China has applied it on an unprecedented scale in factories, public transport, state-owned companies and the military to increase the competitiveness of its manufacturing industry and to maintain social stability.
  • Jin said that at present China’s brain-reading technology was on a par with that in the West but China was the only country where there had been reports of massive use of the technology in the workplace.
  • With improved speed and sensitivity, the device could even become a “mental keyboard” allowing the user to control a computer or mobile phone with their mind.
  • Qiao Zhian, professor of management psychology at Beijing Normal University, said that while the devices could make businesses more competitive the technology could also be abused by companies to control minds and infringe privacy, raising the spectre of “thought police”.
  • “There is no law or regulation to limit the use of this kind of equipment in China. The employer may have a strong incentive to use the technology for higher profit, and the employees are usually in too weak a position to say no,” he said.
  • Lawmakers should act now to limit the use of emotion surveillance and give workers more bargaining power to protect their interests, Qiao said. “The human mind should not be exploited for profit,” he said.
Steve Bosserman

Science has outgrown the human mind and its limited capacities | Aeon Ideas - 0 views

  • Human minds simply cannot reconstruct highly complex natural phenomena efficiently enough in the age of big data. A modern Baconian method that incorporates reductionist ideas through data-mining, but then analyses this information through inductive computational models, could transform our understanding of the natural world. Such an approach would enable us to generate novel hypotheses that have higher chances of turning out to be true, to test those hypotheses, and to fill gaps in our knowledge. It would also provide a much-needed reminder of what science is supposed to be: truth-seeking, anti-authoritarian, and limitlessly free.
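The two-step loop the excerpt sketches - reductionist data-mining first, inductive model-building second - can be illustrated with a minimal, hypothetical example. The synthetic variables and the 0.9 threshold below are invented here, not drawn from the article: score every pairwise correlation in a dataset, then promote only the strong ones to candidate hypotheses worth testing.

```python
import random
import statistics

random.seed(0)
n = 200
x = [random.gauss(0, 1) for _ in range(n)]
y = [2.0 * xi + random.gauss(0, 0.1) for xi in x]  # y truly depends on x
z = [random.gauss(0, 1) for _ in range(n)]         # z is unrelated noise

def pearson(a, b):
    """Plain Pearson correlation coefficient."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    norm = (sum((ai - ma) ** 2 for ai in a)
            * sum((bi - mb) ** 2 for bi in b)) ** 0.5
    return cov / norm

# Step 1 (data mining): score every variable pair.
scores = {"x~y": pearson(x, y), "x~z": pearson(x, z), "y~z": pearson(y, z)}
# Step 2 (induction): keep only strong correlations as candidate hypotheses.
hypotheses = sorted(p for p, r in scores.items() if abs(r) > 0.9)
print(hypotheses)  # the real x~y relationship survives; the noise pairs do not
```

The point of the sketch is the division of labor the excerpt calls for: the machine does the exhaustive mining across all pairs, and the inductive filter proposes which relationships deserve a hypothesis test - a step a human would then design and run.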
Steve Bosserman

Only governments can safeguard the openness of the internet - Rufus Pollock | Aeon Ideas - 0 views

  • The internet’s low-cost transmission can just as easily create information empires and robber barons as it can digital democracy and information equality. The growing value of being able to mine and manipulate huge data-sets, to generate predictions about consumers’ behaviour and desires, creates a self-reinforcing spiral of network effects. Data begets more data, locked down behind each company’s walls where their proprietary algorithms can exploit it for profit.
  • Today, the equivalent gesture might be to turn away from private monopolies to fund innovation and creativity. What matters is who owns information, not just the infrastructure by which it is distributed. Digital technology must be combined with concrete actions that protect openness across the spectrum, from maps to medicines, from software to schools. Better that we do it through public institutions, instead of relying on mavericks and martyrs.
Bill Fulkerson

Do algorithms discriminate? - 0 views

  • As the executive and academic director of a leadership center, I have found in my research that relying on data analytics to eliminate human bias in choosing leaders won't help.
Steve Bosserman

60 Minutes: Facial and emotional recognition; how one man is advancing artificial intel... - 0 views

  • Basically chauffeurs, truck drivers - anyone who does driving for a living - their jobs will be disrupted more in the 15-to-20-year time frame. And many jobs that seem a little bit complex - chef, waiter - a lot of things will become automated. We'll have automated stores, automated restaurants, and altogether in 15 years that's going to displace about 40 percent of the jobs in the world.
  • Because I believe in the sanctity of our soul. I believe there is a lot of things about us that we don't understand. I believe there's a lot of love and compassion that is not explainable in terms of neural networks and computation algorithms. And I currently see no way of solving them. Obviously, unsolved problems have been solved in the past. But it would be irresponsible for me to predict that these will be solved by a certain timeframe.