
Home / GAVNet Collaborative Curation / Group items tagged: layer


Bill Fulkerson

Anatomy of an AI System - 1 views

shared by Bill Fulkerson on 14 Sep 18
  •  
    "With each interaction, Alexa is training to hear better, to interpret more precisely, to trigger actions that map to the user's commands more accurately, and to build a more complete model of their preferences, habits and desires. What is required to make this possible? Put simply: each small moment of convenience - be it answering a question, turning on a light, or playing a song - requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch. A full accounting for these costs is almost impossible, but it is increasingly important that we grasp the scale and scope if we are to understand and govern the technical infrastructures that thread through our lives. The Salar, the world's largest flat surface, is located in southwest Bolivia at an altitude of 3,656 meters above sea level. It is a high plateau, covered by a few meters of salt crust which is exceptionally rich in lithium, containing 50% to 70% of the world's lithium reserves. The Salar, alongside the neighboring Atacama regions in Chile and Argentina, are major sites for lithium extraction. This soft, silvery metal is currently used to power mobile connected devices, as a crucial material used for the production of lithium-ion batteries. It is known as 'grey gold.' Smartphone batteries, for example, usually have less than eight grams of this material. Each Tesla car needs approximately seven kilograms of lithium for its battery pack. All these batteries have a limited lifespan, and once consumed they are thrown away as waste. Amazon reminds users that they cannot open up and repair their Echo, because this will void the warranty. The Amazon Echo is wall-powered, and also has a mobile battery base. This also has a limited lifespan and then must be thrown away as waste. …"
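A quick sense of scale for the figures quoted above; the comparison is mine, not the essay's, and uses the essay's eight-gram and seven-kilogram figures at face value:

```python
SMARTPHONE_BATTERY_LITHIUM_G = 8   # upper bound quoted in the essay
TESLA_PACK_LITHIUM_KG = 7          # approximate figure quoted in the essay

# One car battery pack holds roughly as much lithium as ~875 phone batteries.
phones_per_pack = TESLA_PACK_LITHIUM_KG * 1000 / SMARTPHONE_BATTERY_LITHIUM_G
print(phones_per_pack)  # 875.0
```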
Bill Fulkerson

2:00PM Water Cooler 8/28/2017 | naked capitalism - 0 views

  •  
    "Class Warfare "Towards a History of the Professional: On the Class Composition of the Research University" [Viewpoint Magazine]. From 2013, but it still looks interesting. By arrogating more power to the top layers of academic administrative elite, some in the academic profession saw the possibility of imbricating themselves into the same social class as capitalists, rather than simply serving them. Federal, state, and local laws changed to make students into consumers; courts ruled that public, non-profit universities could patent and own intellectual property; a new type of capital, venture capital, was developed to accelerate the transmission of research into products; and a sub-class of faculty, the adjunct, was formulated to teach the dregs of the expanding university system - those composing the massive undergraduate base, forced into higher education as a college degree became a de facto requirement for admission into any of the professions, and many other occupations. Graduate students and adjuncts took on the bulk of the teaching, freeing star faculty from the responsibility of lecturing to dullards for whom their words would be proverbial pearls before swine."
Bill Fulkerson

Optimization is as hard as approximation - Machine Learning Research Blog - 0 views

  •  
    Optimization is a key tool in machine learning, where the goal is to achieve the best possible objective function value in a minimum amount of time. Obtaining any form of global guarantee is usually possible only for convex objective functions, or for special cases such as risk minimization with over-parameterized one-hidden-layer neural networks (see the June post). In this post, I will consider low-dimensional problems (imagine 10 or 20 dimensions), with no constraint on running time (thus get ready for some running times that are exponential in dimension!).
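The exponential-in-dimension cost alluded to above is easiest to see with the crudest global method, exhaustive grid search. A minimal sketch, not the blog post's method; the test function and grid resolution are illustrative assumptions:

```python
import itertools
import math

def grid_minimize(f, dim, lo=0.0, hi=1.0, points_per_dim=11):
    """Brute-force global minimization of f on the box [lo, hi]^dim.

    Evaluates f at points_per_dim ** dim grid points, so the running
    time is exponential in the dimension: fine for dim = 2 or 3,
    hopeless well before dim = 10 or 20 without smarter methods.
    """
    step = (hi - lo) / (points_per_dim - 1)
    axis = [lo + i * step for i in range(points_per_dim)]
    best_x, best_val = None, math.inf
    for x in itertools.product(axis, repeat=dim):
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

With 11 points per axis, dim = 2 costs 121 evaluations, while dim = 10 already costs 11^10, about 2.6 * 10^10.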
Bill Fulkerson

Understanding the 'deep-carbon cycle' - 0 views

  •  
    New geologic findings about the makeup of the Earth's mantle are helping scientists better understand long-term climate stability and even how seismic waves move through the planet's layers.
Bill Fulkerson

Growth+Sales: The New Era of Enterprise Go-to-Market - Andreessen Horowitz - 0 views

  •  
    Most recent macro trends - cloud compute, social, mobile, crypto, AI - that have reshaped the technology landscape are rooted in new technical capabilities or in pushing the frontier of product form factors. Another shock to the system is emerging today, driven not by the underlying technology, but by an evolution in customer buying behavior. For shorthand, we think of this trend as "growth+sales": the bottom-up growth motion eventually layered with top-down sales.
Bill Fulkerson

Scale and information-processing thresholds in Holocene social evolution | Nature Commu... - 0 views

  •  
    Throughout the Holocene, societies developed additional layers of administration and more information-rich instruments for managing and recording transactions and events as they grew in population and territory. Yet, while such increases seem inevitable, they are not. Here we use the Seshat database to investigate the development of hundreds of polities, from multiple continents, over thousands of years. We find that sociopolitical development is dominated first by growth in polity scale, then by improvements in information processing and economic systems, and then by further increases in scale. We thus define a Scale Threshold for societies, beyond which growth in information processing becomes paramount, and an Information Threshold, which once crossed facilitates additional growth in scale. Polities diverge in socio-political features below the Information Threshold, but reconverge beyond it. We suggest an explanation for the evolutionary divergence between Old and New World polities based on phased growth in scale and information processing. We also suggest a mechanism to help explain social collapses with no evident external causes.
Steve Bosserman

Gamification has a dark side - 0 views

  • Gamification is the application of game elements into nongame spaces. It is the permeation of ideas and values from the sphere of play and leisure to other social spaces. It’s premised on a seductive idea: if you layer elements of games, such as rules, feedback systems, rewards and videogame-like user interfaces over reality, it will make any activity motivating, fair and (potentially) fun. ‘We are starving and games are feeding us,’ writes Jane McGonigal in Reality Is Broken (2011). ‘What if we decided to use everything we know about game design to fix what’s wrong with reality?’
  • But gamification’s trapping of total fun masks that we have very little control over the games we are made to play – and hides the fact that these games are not games at all. Gamified systems are tools, not toys. They can teach complex topics, engage us with otherwise difficult problems. Or they can function as subtle systems of social control.
  • The problem of the gamified workplace goes beyond micromanagement. The business ethicist Tae Wan Kim at Carnegie Mellon University in Pittsburgh warns that gamified systems have the potential to complicate and subvert ethical reasoning. He cites the example of a drowning child. If you save the child, motivated by empathy, sympathy or goodwill – that’s a morally good act. But say you gamify the situation. Say you earn points for saving drowning children. ‘Your gamified act is ethically unworthy,’ he explained to me in an email. Providing extrinsic gamified motivators, even if they work as intended, deprives us of the option to live worthy lives, Kim argues. ‘The workplace is a sacred space where we develop ourselves and help others,’ he notes. ‘Gamified workers have difficulty seeing what contributions they really make.’
  • The 20th-century French philosopher Michel Foucault would have said that these are technologies of power. Today, the interface designer and game scholar Sebastian Deterding says that this kind of gamification expresses a modernist view of a world with top-down managerial control. But the concept is flawed. Gamification promises easy, centralised overviews and control. ‘It’s a comforting illusion because de facto reality is not as predictable as a simulation,’ Deterding says. You can make a model of a city in SimCity that bears little resemblance to a real city. Mistaking games for reality is ultimately mistaking map for territory. No matter how well-designed, a simulation cannot account for the unforeseen.
Steve Bosserman

High score, low pay: why the gig economy loves gamification | Business | The Guardian - 0 views

  • Simply defined, gamification is the use of game elements – point-scoring, levels, competition with others, measurable evidence of accomplishment, ratings and rules of play – in non-game contexts. Games deliver an instantaneous, visceral experience of success and reward, and they are increasingly used in the workplace to promote emotional engagement with the work process, to increase workers’ psychological investment in completing otherwise uninspiring tasks, and to influence, or “nudge”, workers’ behaviour.
  • According to Burawoy, production at Allied was deliberately organised by management to encourage workers to play the game. When work took the form of a game, Burawoy observed, something interesting happened: workers’ primary source of conflict was no longer with the boss. Instead, tensions were dispersed between workers (the scheduling man, the truckers, the inspectors), between operators and their machines, and between operators and their own physical limitations (their stamina, precision of movement, focus). The battle to beat the quota also transformed a monotonous, soul-crushing job into an exciting outlet for workers to exercise their creativity, speed and skill. Workers attached notions of status and prestige to their output, and the game presented them with a series of choices throughout the day, affording them a sense of relative autonomy and control. It tapped into a worker’s desire for self-determination and self-expression. Then, it directed that desire towards the production of profit for their employer.
  • Former Google “design ethicist” Tristan Harris has also described how the “pull-to-refresh” mechanism used in most social media feeds mimics the clever architecture of a slot machine: users never know when they are going to experience gratification – a dozen new likes or retweets – but they know that gratification will eventually come. This unpredictability is addictive: behavioural psychologists have long understood that gambling uses variable reinforcement schedules – unpredictable intervals of uncertainty, anticipation and feedback – to condition players into playing just one more round.
  • Gaming the game, Burawoy observed, allowed workers to assert some limited control over the labour process, and to “make out” as a result. In turn, that win had the effect of reproducing the players’ commitment to playing, and their consent to the rules of the game. When players were unsuccessful, their dissatisfaction was directed at the game’s obstacles, not at the capitalist class, which sets the rules. The inbuilt antagonism between the player and the game replaces, in the mind of the worker, the deeper antagonism between boss and worker. Learning how to operate cleverly within the game’s parameters becomes the only imaginable option. And now there is another layer interposed between labour and capital: the algorithm.
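The variable reinforcement schedule described above (steady average payout, unpredictable reward timing) can be simulated in a few lines. A toy sketch; the 1-in-5 ratio and the seed are arbitrary choices of mine, not from the article:

```python
import random

def fixed_ratio(pulls, every=5):
    """Fixed-ratio schedule: a reward on every Nth action, fully predictable."""
    return [i % every == 0 for i in range(1, pulls + 1)]

def variable_ratio(pulls, p=0.2, seed=0):
    """Variable-ratio schedule: each action is rewarded independently with
    probability p. Same long-run payout as a fixed 1-in-5 schedule, but the
    gap until the next reward is unpredictable -- the slot-machine pattern
    that conditions players into 'just one more round'."""
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(pulls)]
```

Both schedules pay out roughly one action in five; only the second produces the uncertainty-and-anticipation loop the article describes.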
Steve Bosserman

How We Made AI As Racist and Sexist As Humans - 0 views

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans— are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • But not everyone will be equally represented in that data.
  • And sometimes, even when ample data exists, those who build the training sets don’t take deliberate measures to ensure their diversity.
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
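The point that bias is "just math picking up on correlations" is easy to demonstrate: even the simplest possible learner, one that merely memorizes the majority outcome per group, faithfully reproduces whatever skew its historical data contains. A toy sketch; the groups, outcomes, and counts are invented for illustration, not taken from the article:

```python
from collections import Counter, defaultdict

def train_majority(records):
    """'Learn' by memorizing the majority outcome seen for each group.
    There is no intentionality here -- only counting -- yet the model
    inherits every imbalance present in its training data."""
    by_group = defaultdict(Counter)
    for group, outcome in records:
        by_group[group][outcome] += 1
    return {g: counts.most_common(1)[0][0] for g, counts in by_group.items()}

# Hypothetical, deliberately skewed 'historical' hiring records.
history = ([("A", "hire")] * 80 + [("A", "reject")] * 20
           + [("B", "hire")] * 20 + [("B", "reject")] * 80)

model = train_majority(history)  # {'A': 'hire', 'B': 'reject'}
```

"Get better data" in the article's closing sense means changing `history`, not the math: the counting procedure is identical either way.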