
GAVNet Collaborative Curation - Group items tagged: systems


Bill Fulkerson

Anatomy of an AI System - 1 views

shared by Bill Fulkerson on 14 Sep 18
  •  
    "With each interaction, Alexa is training to hear better, to interpret more precisely, to trigger actions that map to the user's commands more accurately, and to build a more complete model of their preferences, habits and desires. What is required to make this possible? Put simply: each small moment of convenience - be it answering a question, turning on a light, or playing a song - requires a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. The scale of resources required is many magnitudes greater than the energy and labor it would take a human to operate a household appliance or flick a switch. A full accounting for these costs is almost impossible, but it is increasingly important that we grasp the scale and scope if we are to understand and govern the technical infrastructures that thread through our lives. III The Salar, the world's largest flat surface, is located in southwest Bolivia at an altitude of 3,656 meters above sea level. It is a high plateau, covered by a few meters of salt crust which are exceptionally rich in lithium, containing 50% to 70% of the world's lithium reserves. 4 The Salar, alongside the neighboring Atacama regions in Chile and Argentina, are major sites for lithium extraction. This soft, silvery metal is currently used to power mobile connected devices, as a crucial material used for the production of lithium-Ion batteries. It is known as 'grey gold.' Smartphone batteries, for example, usually have less than eight grams of this material. 5 Each Tesla car needs approximately seven kilograms of lithium for its battery pack. 6 All these batteries have a limited lifespan, and once consumed they are thrown away as waste. Amazon reminds users that they cannot open up and repair their Echo, because this will void the warranty. The Amazon Echo is wall-powered, and also has a mobile battery base. This also has a limited lifespan and then must be thrown away as waste. According to the Ay
Bill Fulkerson

Why a 400-Year Program of Modernist Thinking is Exploding | naked capitalism - 0 views

  •  
    " Fearless commentary on finance, economics, politics and power Follow yvessmith on Twitter Feedburner RSS Feed RSS Feed for Comments Subscribe via Email SUBSCRIBE Recent Items Links 3/11/17 - 03/11/2017 - Yves Smith Deutsche Bank Tries to Stay Alive - 03/11/2017 - Yves Smith John Helmer: Australian Government Trips Up Ukrainian Court Claim of MH17 as Terrorism - 03/11/2017 - Yves Smith 2:00PM Water Cooler 3/10/2017 - 03/10/2017 - Lambert Strether Why a 400-Year Program of Modernist Thinking is Exploding - 03/10/2017 - Yves Smith Links 3/10/17 - 03/10/2017 - Yves Smith Why It Will Take a Lot More Than a Smartphone to Get the Sharing Economy Started - 03/10/2017 - Yves Smith CalPERS' General Counsel Railroads Board on Fiduciary Counsel Selection - 03/10/2017 - Yves Smith Another Somalian Famine - 03/10/2017 - Yves Smith Trade now with TradeStation - Highest rated for frequent traders Why a 400-Year Program of Modernist Thinking is Exploding Posted on March 10, 2017 by Yves Smith By Lynn Parramore, Senior Research Analyst at the Institute for New Economic Thinking. Originally published at the Institute for New Economic Thinking website Across the globe, a collective freak-out spanning the whole political system is picking up steam with every new "surprise" election, rush of tormented souls across borders, and tweet from the star of America's great unreality show, Donald Trump. But what exactly is the force that seems to be pushing us towards Armageddon? Is it capitalism gone wild? Globalization? Political corruption? Techno-nightmares? Rajani Kanth, a political economist, social thinker, and poet, goes beyond any of these explanations for the answer. In his view, what's throwing most of us off kilter - whether we think of ourselves as on the left or right, capitalist or socialist -was birthed 400 years ago during the period of the Enlightenment. It's a set of assumptions, a particular way of looking at the world that pushed out previous modes o
Steve Bosserman

The Fourth Industrial Revolution: Proceedings of a Workshop-in Brief - 0 views

  •  
    The Forum's perspective on present and future technological and societal changes is captured in their 'Principled Framework for the Fourth Industrial Revolution.' Philbeck explained the four principles that characterize the Fourth Industrial Revolution.
    * Think systems, not technologies. Individual technologies are interesting, but it is their systemic impact that matters. Emerging technologies challenge our societal values and norms, sometimes for good, but sometimes also in negative ways; the Fourth Industrial Revolution will have civilization-changing impact - on species, on the planet, on geopolitics, and on the global economy. Philbeck suggested that wealth creation and aggregation supported by this phase of technological innovation may challenge societal commitments to accessibility, inclusivity, and fairness and create the need for relentless worker re-education. As Philbeck stated, "The costs for greater productivity are often externalized to stakeholders who are not involved in a particular technology's development."
    * Empowering, not determining. The Forum urges an approach to the Fourth Industrial Revolution that honors existing social principles. "We need to take a stance toward technology and technological systems that empowers society and acts counter to fatalistic and deterministic views, so that society and its agency is not nullified," said Philbeck. "Technologies are not forces; we have the ability to shape them and decide on how they are applied."
    * Future by design, and not by default. Seeking a future by design requires active governance. There are many types of governance - by individuals, by governments, by civic society, and by companies. Philbeck argued that failure to pay attention to critical governance questions in consideration of the Fourth Industrial Revolution means societies are likely to allow undemocratic, random, and potentially malicious forces to shape the future of technological systems and th
Bill Fulkerson

How Complex Web Systems Fail - Part 2 - Production Ready - Medium - 0 views

  •  
    "In his influential paper How Complex Systems Fail, Richard Cook shares 18 brilliant observations on the nature of failure in complex systems. Part 1 of this article was my attempt to translate the first nine of his observations into the context of web systems, i.e., the distributed systems behind modern web applications. In this second and final part, I'm going to complete the picture and cover the other half of Cook's paper. So let's get started with observation #10!"
Bill Fulkerson

Systems | Free Full-Text | Developing a Preliminary Causal Loop Diagram for Understandi... - 0 views

  •  
    COVID-19 is a wicked problem for policy makers internationally, as the complexity of the pandemic transcends health, environmental, social and economic boundaries. Many countries are focusing on two key responses, namely virus containment and financial measures, but fail to recognise other aspects. The systems approach, however, enables policy makers to design the most effective strategies and reduce unintended consequences. To achieve fundamental change, it is imperative to first identify the "right" interventions (leverage points) and implement additional measures to reduce negative consequences. To do so, a preliminary causal loop diagram of the COVID-19 pandemic was designed to explore its influence on socio-economic systems. In order to transcend the "wait and see" approach, and create an adaptive and resilient system, governments need to consider "deep" leverage points that can be realistically maintained over the long term and cause a fundamental change, rather than focusing on "shallow" leverage points that are relatively easy to implement but do not result in significant systemic change.
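As a rough, hypothetical illustration of what a causal loop diagram looks like in computational form (not the model from the paper), the sketch below encodes a handful of assumed COVID-19 feedback links as a signed directed graph and classifies each loop as reinforcing or balancing. The variable names, link polarities, and the use of the networkx library are all assumptions made for illustration.

```python
# A minimal sketch of a causal loop diagram (CLD) as a signed digraph.
# The variables and link polarities below are illustrative assumptions,
# not the model developed in the paper.
import networkx as nx

cld = nx.DiGraph()
links = [
    ("infections", "hospital load", +1),
    ("hospital load", "restrictions", +1),
    ("restrictions", "contact rate", -1),
    ("contact rate", "infections", +1),
    ("restrictions", "economic activity", -1),
    ("economic activity", "public compliance", +1),
    ("public compliance", "contact rate", -1),
]
for src, dst, sign in links:
    cld.add_edge(src, dst, polarity=sign)

# Classify each feedback loop: reinforcing if the product of link
# polarities is positive, balancing if it is negative.
for cycle in nx.simple_cycles(cld):
    sign = 1
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        sign *= cld[a][b]["polarity"]
    label = "reinforcing" if sign > 0 else "balancing"
    print(f"{label}: {' -> '.join(cycle)}")
```

In this framing, a "deep" leverage point corresponds to changing the loop structure itself (adding or removing links), whereas a "shallow" one only tunes the strength of an existing link.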
Steve Bosserman

Families and children in the next system - 0 views

  •  
    "This paper explores the current deficiencies in the way the United States supports children and families and shows how a new economic model-a "next system," based in part on the best practices currently in use somewhere in the world-might better provide for flourishing families and children, both in the United States and around the world. We consider current policies, both international and domestic, that seem to provide the best results in today's global economic system. We then suggest how these "best practices" might be incorporated into a larger model for family and child well-being in "the next system." We proceed to consider new ideas not yet implemented that can improve outcomes, and, finally, suggest some pragmatic, step-by-step strategies for moving toward a world that offers the best possible outcomes for all." https://thenextsystem.org/learn/stories/families-and-children-next-system
Bill Fulkerson

How Complex Systems Fail | the morning paper - 0 views

  •  
    "This is a wonderfully short and easy to read paper looking at how complex systems fail - it's written by a Doctor (MD) in the context of systems of patient care, but that makes it all the more fun to translate the lessons into complex IT systems, including their human operator components. The paper consists of 18 observations. Here are some of my favourites…."
Steve Bosserman

How We Made AI As Racist and Sexist As Humans - 0 views

  • Artificial intelligence may have cracked the code on certain tasks that typically require human smarts, but in order to learn, these algorithms need vast quantities of data that humans have produced. They hoover up that information, rummage around in search of commonalities and correlations, and then offer a classification or prediction (whether that lesion is cancerous, whether you’ll default on your loan) based on the patterns they detect. Yet they’re only as clever as the data they’re trained on, which means that our limitations—our biases, our blind spots, our inattention—become theirs as well.
  • The majority of AI systems used in commercial applications—the ones that mediate our access to services like jobs, credit, and loans— are proprietary, their algorithms and training data kept hidden from public view. That makes it exceptionally difficult for an individual to interrogate the decisions of a machine or to know when an algorithm, trained on historical examples checkered by human bias, is stacked against them. And forget about trying to prove that AI systems may be violating human rights legislation.
  • Data is essential to the operation of an AI system. And the more complicated the system—the more layers in the neural nets, to translate speech or identify faces or calculate the likelihood someone defaults on a loan—the more data must be collected.
  • But not everyone will be equally represented in that data.
  • And sometimes, even when ample data exists, those who build the training sets don't take deliberate measures to ensure its diversity.
  • The power of the system is its “ability to recognize that correlations occur between gender and professions,” says Kathryn Hume. “The downside is that there’s no intentionality behind the system—it’s just math picking up on correlations. It doesn’t know this is a sensitive issue.” There’s a tension between the futuristic and the archaic at play in this technology. AI is evolving much more rapidly than the data it has to work with, so it’s destined not just to reflect and replicate biases but also to prolong and reinforce them.
  • Accordingly, groups that have been the target of systemic discrimination by institutions that include police forces and courts don’t fare any better when judgment is handed over to a machine.
  • A growing field of research, in fact, now looks to apply algorithmic solutions to the problems of algorithmic bias.
  • Still, algorithmic interventions only do so much; addressing bias also demands diversity in the programmers who are training machines in the first place.
  • A growing awareness of algorithmic bias isn’t only a chance to intervene in our approaches to building AI systems. It’s an opportunity to interrogate why the data we’ve created looks like this and what prejudices continue to shape a society that allows these patterns in the data to emerge.
  • Of course, there’s another solution, elegant in its simplicity and fundamentally fair: get better data.
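A toy sketch can make the "just math picking up on correlations" point above concrete: a system that only counts co-occurrences in skewed text will reproduce that skew as its "prediction." The corpus below is invented for illustration and is not drawn from the article.

```python
# Toy illustration: a model that only picks up on correlations will
# reproduce whatever gender/profession skew its training text contains.
from collections import Counter
from itertools import product

# Tiny, deliberately skewed "training corpus" (an assumption for illustration).
corpus = [
    "he is a doctor", "he is an engineer", "he is a nurse",
    "she is a nurse", "she is a nurse", "she is a teacher",
    "he is a doctor", "she is a teacher", "he is an engineer",
]

pairs = Counter()
for sentence in corpus:
    words = sentence.split()
    for g, p in product(("he", "she"), ("doctor", "engineer", "nurse", "teacher")):
        if g in words and p in words:
            pairs[(g, p)] += 1

# "Prediction" = most frequently co-occurring gender word per profession.
for prof in ("doctor", "engineer", "nurse", "teacher"):
    guess = max(("he", "she"), key=lambda g: pairs[(g, prof)])
    print(f"{prof}: model associates '{guess}' "
          f"(he={pairs[('he', prof)]}, she={pairs[('she', prof)]})")
```

Nothing in the code expresses an intention about gender; the skewed associations fall out of the counts alone, which is the tension the annotation describes.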
Bill Fulkerson

Gender imbalanced datasets may affect the performance of AI pathology classifi... - 0 views

  •  
    Though it may not be common knowledge, AI systems are currently being used in a wide variety of commercial applications - including article selection on news and social media sites, decisions about which movies get made, and the maps that appear on our phones - and they have become trusted tools for big business. But their use has not always been without controversy. In recent years, researchers have found that AI apps used to approve mortgage and other loan applications are biased, for example, in favor of white males. This, researchers found, was because the dataset used to train the system mostly comprised white male profiles. In this new effort, the researchers wondered if the same might be true for AI systems used to assist doctors in diagnosing patients.
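A minimal sketch of the kind of check this describes, using synthetic data and an assumed scikit-learn setup rather than the study's actual pathology pipeline: train a classifier on a sex-imbalanced training set and compare held-out accuracy for the over- and under-represented groups.

```python
# Minimal sketch (synthetic data, not the study's pipeline): train a
# diagnostic classifier on an imbalanced dataset and check per-group accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, signal_cols):
    # Synthetic "patients": 5 features, binary label; the disease signal
    # appears in different feature columns for each group.
    y = rng.integers(0, 2, n)
    X = rng.normal(size=(n, 5))
    X[:, signal_cols] += y[:, None] * 1.5
    return X, y

# Imbalanced training set: 900 samples from group A, only 100 from group B.
X_a, y_a = make_group(900, [0, 1])
X_b, y_b = make_group(100, [3, 4])
model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.concatenate([y_a, y_b]))

# Evaluate on balanced held-out sets, one per group.
for name, cols in [("group A (90% of training data)", [0, 1]),
                   ("group B (10% of training data)", [3, 4])]:
    X_t, y_t = make_group(2000, cols)
    print(f"{name}: accuracy = {accuracy_score(y_t, model.predict(X_t)):.2f}")
```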
Steve Bosserman

Dark times call for brighter new visions of the world we want to see - The Next System ... - 0 views

  • The failure of traditional politics reflects the failure of our political-economic system and our traditional strategies. The old ways simply no longer work. Systemic crisis means that systemic change is ultimately necessary – no matter how difficult it may be. Beginning with community and building up, from diverse directions and working together in a new politics and a practical strategy of system-wide change ultimately offers the only serious response to a crisis of system-wide dimensions.
Bill Fulkerson

Adaptation to low parasite abundance affects immune investment and immunopathological r... - 0 views

  •  
    Using two independent single-cell approaches, we identified a shift in the overall immune cell composition in cavefish as the underlying cellular mechanism, indicating strong differences in the immune investment strategy. While surface fish invest evenly into the innate and adaptive immune systems, cavefish shifted immune investment to the adaptive immune system, and here, mainly towards specific T-cell populations that promote homeostasis. Additionally, inflammatory responses and immunopathological phenotypes in visceral adipose tissue are drastically reduced in cavefish. Our data indicate that long-term adaptation to low parasite diversity coincides with a more sensitive immune system in cavefish, which is accompanied by a reduction in the immune cells that play a role in mediating the pro-inflammatory response.
Bill Fulkerson

The science and medicine of human immunology | Science - 0 views

  •  
    The coronavirus disease 2019 (COVID-19) pandemic has underscored the critical need to better understand the human immune system and how to unleash its power to develop vaccines and therapeutics. Much of our knowledge of the immune system has accrued from studies in mice, yet vaccines and drugs that work effectively in mice do not always translate into humans. Pulendran and Davis review recent technological advances that have facilitated the study of the immune system in humans. They discuss new insights and how these can affect the development of drugs and vaccines in the modern era.
Steve Bosserman

The Blockchain Energy System Is Going To Be Great For Consumers | Co.Exist | ideas + im... - 0 views

  • Martin argues that the decentralized blockchain system, with its string of trusted nodes all over the world, could align with the decentralized energy system to create something really new. Essentially, the blockchain could complete the job of solar panels in allowing people to sell energy at the price they want and maintain rights to their power whenever they need it. (We covered some other blockchain energy projects here).
Steve Bosserman

Are You Creditworthy? The Algorithm Will Decide. - 0 views

  • The decisions made by algorithmic credit scoring applications are not only said to be more accurate in predicting risk than traditional scoring methods; their champions argue they are also fairer because the algorithm is unswayed by the racial, gender, and socioeconomic biases that have skewed access to credit in the past.
  • Algorithmic credit scores might seem futuristic, but these practices do have roots in credit scoring practices of yore. Early credit agencies, for example, hired human reporters to dig into their customers’ credit histories. The reports were largely compiled from local gossip and colored by the speculations of the predominantly white, male middle class reporters. Remarks about race and class, asides about housekeeping, and speculations about sexual orientation all abounded.
  • By 1935, whole neighborhoods in the U.S. were classified according to their credit characteristics. A map from that year of Greater Atlanta comes color-coded in shades of blue (desirable), yellow (definitely declining) and red (hazardous). The legend recalls a time when an individual’s chances of receiving a mortgage were shaped by their geographic status.
  • These systems are fast becoming the norm. The Chinese Government is now close to launching its own algorithmic “Social Credit System” for its 1.4 billion citizens, a metric that uses online data to rate trustworthiness. As these systems become pervasive, and scores come to stand for individual worth, determining access to finance, services, and basic freedoms, the stakes of one bad decision are that much higher. This is to say nothing of the legitimacy of using such algorithmic proxies in the first place. While it might seem obvious to call for greater transparency in these systems, with machine learning and massive datasets it’s extremely difficult to locate bias. Even if we could peer inside the black box, we probably wouldn’t find a clause in the code instructing the system to discriminate against the poor, or people of color, or even people who play too many video games. More important than understanding how these scores get calculated is giving users meaningful opportunities to dispute and contest adverse decisions that are made about them by the algorithm.
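As an illustrative sketch only (synthetic data, not any real scoring system), the snippet below shows why "peering inside the black box" rarely reveals an explicit discriminatory clause: a model that never sees the protected attribute can still produce skewed approval rates when a proxy feature correlates with it.

```python
# Illustrative sketch: no protected attribute appears in the model inputs,
# yet approval rates differ by group because a proxy carries the correlation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

group = rng.integers(0, 2, n)                      # protected attribute (never shown to the model)
neighborhood = (rng.random(n) < 0.2 + 0.6 * group).astype(float)  # proxy correlated with group
income = rng.normal(50 + 10 * (1 - group), 15, n)  # historical inequality baked into the data
repaid = (rng.random(n) < 1 / (1 + np.exp(-(income - 50) / 10))).astype(int)

X = np.column_stack([income, neighborhood])        # model inputs contain no protected attribute
model = LogisticRegression().fit(X, repaid)

approved = model.predict_proba(X)[:, 1] > 0.6
for g in (0, 1):
    print(f"group {g}: approval rate = {approved[group == g].mean():.2f}")
```

Inspecting the fitted coefficients would show nothing that "instructs" discrimination, which is why the annotation argues that meaningful opportunities to contest adverse decisions matter more than code audits alone.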
Bill Fulkerson

The worst thing I read this year, and what it taught me… or Can we design soc... - 0 views

  •  
    "I'm going to teach a new course this fall, tentatively titled "Technology and Social Change". It's going to include an examination of the four levers of social change Larry Lessig suggests in Code and which I've been exploring as possible paths to civic engagement. It will include deep methodological dives into codesign, and into using anthropology as tool for understanding user needs. It will look at unintended consequences, cases where technology's best intentions fail, and cases where careful exploration and preparation led to technosocial systems that make users and communities more powerful than they were before."