Home/ Dystopias/ Group items tagged tech

Ed Webb

Clear backpacks, monitored emails: life for US students under constant surveillance | E... - 0 views

  • This level of surveillance is “not too over-the-top”, Ingrid said, and she feels her classmates are generally “accepting” of it.
  • One leading student privacy expert estimated that as many as a third of America’s roughly 15,000 school districts may already be using technology that monitors students’ emails and documents for phrases that might flag suicidal thoughts, plans for a school shooting, or a range of other offenses.
  • When Dapier talks with other teen librarians about the issue of school surveillance, “we’re very alarmed,” he said. “It sort of trains the next generation that [surveillance] is normal, that it’s not an issue. What is the next generation’s Mark Zuckerberg going to think is normal?”
  • ...13 more annotations...
  • Some parents said they were alarmed and frightened by schools’ new monitoring technologies. Others said they were conflicted, seeing some benefits to schools watching over what kids are doing online, but uncertain if their schools were striking the right balance with privacy concerns. Many said they were not even sure what kind of surveillance technology their schools might be using, and that the permission slips they had signed when their kids brought home school devices had told them almost nothing
  • “They’re so unclear that I’ve just decided to cut off the research completely, to not do any of it.”
  • As of 2018, at least 60 American school districts had also spent more than $1m on separate monitoring technology to track what their students were saying on public social media accounts, an amount that spiked sharply in the wake of the 2018 Parkland school shooting, according to the Brennan Center for Justice, a progressive advocacy group that compiled and analyzed school contracts with a subset of surveillance companies.
  • “They are all mandatory, and the accounts have been created before we’ve even been consulted,” he said. Parents are given almost no information about how their children’s data is being used, or the business models of the companies involved. Any time his kids complete school work through a digital platform, they are generating huge amounts of very personal, and potentially very valuable, data. The platforms know what time his kids do their homework, and whether it’s done early or at the last minute. They know what kinds of mistakes his kids make on math problems.
  • Felix, now 12, said he is frustrated that the school “doesn’t really [educate] students on what is OK and what is not OK. They don’t make it clear when they are tracking you, or not, or what platforms they track you on. “They don’t really give you a list of things not to do,” he said. “Once you’re in trouble, they act like you knew.”
  • “It’s the school as panopticon, and the sweeping searchlight beams into homes, now, and to me, that’s just disastrous to intellectual risk-taking and creativity.”
  • Many parents also said that they wanted more transparency and more parental control over surveillance. A few years ago, Ben, a tech professional from Maryland, got a call from his son’s principal to set up an urgent meeting. His son, then about nine or 10 years old, had opened up a school Google document and typed “I want to kill myself.” It was not until he and his son were in a serious meeting with school officials that Ben found out what happened: his son had typed the words on purpose, curious about what would happen. “The smile on his face gave away that he was testing boundaries, and not considering harming himself,” Ben said. (He asked that his last name and his son’s school district not be published, to preserve his son’s privacy.) The incident was resolved easily, he said, in part because Ben’s family already had close relationships with the school administrators.
  • there is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and suicide.
  • Certain groups of students could easily be targeted by the monitoring more intensely than others, she said. Would Muslim students face additional surveillance? What about black students? Her daughter, who is 11, loves hip-hop music. “Maybe some of that language could be misconstrued, by the wrong ears or the wrong eyes, as potentially violent or threatening,” she said.
  • The Parent Coalition for Student Privacy was founded in 2014, in the wake of parental outrage over the attempt to create a standardized national database that would track hundreds of data points about public school students, from their names and social security numbers to their attendance, academic performance, and disciplinary and behavior records, and share the data with education tech companies. The effort, which had been funded by the Gates Foundation, collapsed in 2014 after fierce opposition from parents and privacy activists.
  • “More and more parents are organizing against the onslaught of ed tech and the loss of privacy that it entails. But at the same time, there’s so much money and power and political influence behind these groups,”
  • some privacy experts – and students – said they are concerned that surveillance at school might actually be undermining students’ wellbeing
  • “I do think the constant screen surveillance has affected our anxiety levels and our levels of depression.” “It’s over-guarding kids,” she said. “You need to let them make mistakes, you know? That’s kind of how we learn.”
Ed Webb

BBC News - Cult of less: Living out of a hard drive - 0 views

  • The DJ has now replaced his bed with friends' couches, paper bills with online banking, and a record collection containing nearly 2,000 albums with an external hard drive with DJ software and nearly 13,000 MP3s
    • Ed Webb
       
      MP3s are convenient, of course, but they don't sound even half as good as vinyl. Seriously.
  • Mr Klein says the lifestyle can become loathsome because "you never know where you will sleep". And Mr Yurista says he frequently worries he may lose his new digital life to a hard drive crash or downed server. "You have to really make sure you have back-ups of your digital goods everywhere," he said.
  • like a house fire that rips through a family's prized possessions, when someone loses their digital goods to a computer crash, they can be devastated. Kelly Chessen, a 36-year-old former suicide hotline counsellor with a soothing voice and reassuring personality, is Drive Savers' official "data crisis counsellor". Part-psychiatrist and part-tech enthusiast, Ms Chessen's role is to try to calm people down when they lose their digital possessions to failed drives. Ms Chessen says some people have gone as far as to threaten suicide over their lost digital possessions and data. "It's usually indirect threats like, 'I'm not sure what I'm going to do if I can't get the data back,' but sometimes it will be a direct threat such as, 'I may just have to end it if I can't get to the information',"
  • ...4 more annotations...
  • Dr Sandberg believes we could be living on hard drives along with our digital possessions in the not too distant future, which would allow us to shed the trouble of owning a body. The concept is called "mind uploading", and it suggests that when our bodies age and begin to fail like a worn or snapped record, we may be able to continue living consciously inside a computer as our own virtual substitutes. "It's the idea that we can copy or transfer the information inside the brain into a form that can be run on the computer," said Dr Sandberg. He added: "That would mean that your consciousness or a combination of that would continue in the computer." Dr Sandberg says although it's just a theory now, researchers and engineers are working on super computers that could one day handle a map of all the networks of neurons and synapses in our brains - and that map could produce human consciousness outside of the body.
  • Mr Sutton is the founder of CultofLess.com, a website which has helped him sell or give away his possessions - apart from his laptop, an iPad, an Amazon Kindle, two external hard drives, a "few" articles of clothing and bed sheets for a mattress that was left in his newly rented apartment. This 21st-Century minimalist says he got rid of much of his clutter because he felt the ever-increasing number of available digital goods have provided adequate replacements for his former physical possessions
  • The tech-savvy Los Angeles "transplant" credits his external hard drives and online services like iTunes, Hulu, Flickr, Facebook, Skype and Google Maps for allowing him to lead a minimalist life.
  • the internet has replaced my need for an address
Ed Webb

Are we raising a generation of nincompoops? - Boston.com - 3 views

  • This is an example of a common and pernicious version of technophobia - the idea that overreliance on new tech disempowers and renders stupid a new generation. Does it have any credibility?
Ed Webb

Does the Digital Classroom Enfeeble the Mind? - NYTimes.com - 0 views

  • My father would have been unable to “teach to the test.” He once complained about errors in a sixth-grade math textbook, so he had the class learn math by designing a spaceship. My father would have been spat out by today’s test-driven educational regime.
  • A career in computer science makes you see the world in its terms. You start to see money as a form of information display instead of as a store of value. Money flows are the computational output of a lot of people planning, promising, evaluating, hedging and scheming, and those behaviors start to look like a set of algorithms. You start to see the weather as a computer processing bits tweaked by the sun, and gravity as a cosmic calculation that keeps events in time and space consistent. This way of seeing is becoming ever more common as people have experiences with computers. While it has its glorious moments, the computational perspective can at times be uniquely unromantic. Nothing kills music for me as much as having some algorithm calculate what music I will want to hear. That seems to miss the whole point. Inventing your musical taste is the point, isn’t it? Bringing computers into the middle of that is like paying someone to program a robot to have sex on your behalf so you don’t have to. And yet it seems we benefit from shining an objectifying digital light to disinfect our funky, lying selves once in a while. It’s heartless to have music chosen by digital algorithms. But at least there are fewer people held hostage to the tastes of bad radio D.J.’s than there once were. The trick is being ambidextrous, holding one hand to the heart while counting on the digits of the other.
  • The future of education in the digital age will be determined by our judgment of which aspects of the information we pass between generations can be represented in computers at all. If we try to represent something digitally when we actually can’t, we kill the romance and make some aspect of the human condition newly bland and absurd. If we romanticize information that shouldn’t be shielded from harsh calculations, we’ll suffer bad teachers and D.J.’s and their wares.
  • ...5 more annotations...
  • Some of the top digital designs of the moment, both in school and in the rest of life, embed the underlying message that we understand the brain and its workings. That is false. We don’t know how information is represented in the brain. We don’t know how reason is accomplished by neurons. There are some vaguely cool ideas floating around, and we might know a lot more about these things any moment now, but at this moment, we don’t. You could spend all day reading literature about educational technology without being reminded that this frontier of ignorance lies before us. We are tempted by the demons of commercial and professional ambition to pretend we know more than we do.
  • Outside school, something similar happens. Students spend a lot of time acting as trivialized relays in giant schemes designed for the purposes of advertising and other revenue-minded manipulations. They are prompted to create databases about themselves and then trust algorithms to assemble streams of songs and movies and stories for their consumption. We see the embedded philosophy bloom when students assemble papers as mash-ups from online snippets instead of thinking and composing on a blank piece of screen. What is wrong with this is not that students are any lazier now or learning less. (It is probably even true, I admit reluctantly, that in the presence of the ambient Internet, maybe it is not so important anymore to hold an archive of certain kinds of academic trivia in your head.) The problem is that students could come to conceive of themselves as relays in a transpersonal digital structure. Their job is then to copy and transfer data around, to be a source of statistics, whether to be processed by tests at school or by advertising schemes elsewhere.
  • If students don’t learn to think, then no amount of access to information will do them any good.
  • To the degree that education is about the transfer of the known between generations, it can be digitized, analyzed, optimized and bottled or posted on Twitter. To the degree that education is about the self-invention of the human race, the gargantuan process of steering billions of brains into unforeseeable states and configurations in the future, it can continue only if each brain learns to invent itself. And that is beyond computation because it is beyond our comprehension.
  • Roughly speaking, there are two ways to use computers in the classroom. You can have them measure and represent the students and the teachers, or you can have the class build a virtual spaceship. Right now the first way is ubiquitous, but the virtual spaceships are being built only by tenacious oddballs in unusual circumstances. More spaceships, please.
  • How do we get this right - use the tech for what it can do well, develop our brains for what the tech can't do? Who's up for building a spaceship?
Ed Webb

We, The Technocrats - blprnt - Medium - 2 views

  • Silicon Valley’s go-to linguistic dodge: the collective we
  • “What kind of a world do we want to live in?”
  • Big tech’s collective we is its ‘all lives matter’, a way to soft-pedal concerns about privacy while refusing to speak directly to dangerous inequalities.
  • ...7 more annotations...
  • One two-letter word cannot possibly hold all of the varied experiences of data, specifically those of the people who are at the most immediate risk: visible minorities, LGBTQ+ people, indigenous communities, the elderly, the disabled, displaced migrants, the incarcerated
  • At least twenty-six states allow the FBI to perform facial recognition searches against their databases of images from drivers licenses and state IDs, despite the fact that the FBI’s own reports have indicated that facial recognition is less accurate for black people. Black people, already at a higher risk of arrest and incarceration than other Americans, feel these data systems in a much different way than I do
  • last week, the Department of Justice passed a brief to the Supreme Court arguing that sex discrimination protections do not extend to transgender people. If this ruling were to be supported, it would immediately put trans women and men at more risk than others from the surveillant data technologies that are becoming more and more common in the workplace. Trans people will be put in distinct danger — a reality that is lost when they are folded neatly into a communal we
  • I looked at the list of speakers for the conference in Brussels to get an idea of the particular we of Cook’s audience, which included Mark Zuckerberg, Google’s CEO Sundar Pichai and the King of Spain. Of the presenters, 57% were men and 83% were white. Only 4 of the 132 people on stage were black.
  • another we that Tim Cook necessarily speaks on the behalf of: privileged men in tech. This we includes Mark and Sundar; it includes 60% of Silicon Valley and 91% of its equity. It is this we who have reaped the most benefit from Big Data and carried the least risk, all while occupying the most time on stage
  • Here’s a more urgent question for us, one that doesn’t ask what we want but instead what they need: How can this new data world be made safer for the people who are facing real risks, right now?
  • “The act of listening has greater ethical potential than speaking” — Julietta Singh
Ed Webb

Where is the boundary between your phone and your mind? | US news | The Guardian - 1 views

  • Here’s a thought experiment: where do you end? Not your body, but you, the nebulous identity you think of as your “self”. Does it end at the limits of your physical form? Or does it include your voice, which can now be heard as far as outer space; your personal and behavioral data, which is spread out across the impossibly broad plane known as digital space; and your active online personas, which probably encompass dozens of different social media networks, text message conversations, and email exchanges? This is a question with no clear answer, and, as the smartphone grows ever more essential to our daily lives, that border’s only getting blurrier.
  • our minds have become even more radically extended than ever before
  • one of the essential differences between a smartphone and a piece of paper, which is that our relationship with our phones is reciprocal: we not only put information into the device, we also receive information from it, and, in that sense, it shapes our lives far more actively than would, say, a shopping list. The shopping list isn’t suggesting to us, based on algorithmic responses to our past and current shopping behavior, what we should buy; the phone is
  • ...10 more annotations...
  • American consumers spent five hours per day on their mobile devices, and showed a dizzying 69% year-over-year increase in time spent in apps like Facebook, Twitter, and YouTube. The prevalence of apps represents a concrete example of the movement away from the old notion of accessing the Internet through a browser and the new reality of the connected world and its myriad elements – news, social media, entertainment – being with us all the time
  • “In the 90s and even through the early 2000s, for many people, there was this way of thinking about cyberspace as a space that was somewhere else: it was in your computer. You went to your desktop to get there,” Weigel says. “One of the biggest shifts that’s happened and that will continue to happen is the undoing of a border that we used to perceive between the virtual and the physical world.”
  • While many of us think of the smartphone as a portal for accessing the outside world, the reciprocity of the device, as well as the larger pattern of our behavior online, means the portal goes the other way as well: it’s a means for others to access us
  • Weigel sees the unfettered access to our data, through our smartphone and browser use, of what she calls the big five tech companies – Apple, Alphabet (the parent company of Google), Microsoft, Facebook, and Amazon – as a legitimate problem for notions of democracy
  • an unfathomable amount of wealth, power, and direct influence on the consumer in the hands of just a few individuals – individuals who can affect billions of lives with a tweak in the code of their products
  • “This is where the fundamental democracy deficit comes from: you have this incredibly concentrated private power with zero transparency or democratic oversight or accountability, and then they have this unprecedented wealth of data about their users to work with,”
  • the rhetoric around the Internet was that the crowd would prevent the spread of misinformation, filtering it out like a great big hive mind; it would also help to prevent the spread of things like hate speech. Obviously, this has not been the case, and even the relatively successful experiments in this, such as Wikipedia, have a great deal of human governance that allows them to function properly
  • We should know and be aware of how these companies work, how they track our behavior, and how they make recommendations to us based on our behavior and that of others. Essentially, we need to understand the fundamental difference between our behavior IRL and in the digital sphere – a difference that, despite the erosion of boundaries, still stands
  • “Whether we know it or not, the connections that we make on the Internet are being used to cultivate an identity for us – an identity that is then sold to us afterward,” Lynch says. “Google tells you what questions to ask, and then it gives you the answers to those questions.”
  • It isn’t enough that the apps in our phone flatten all of the different categories of relationships we have into one broad group: friends, followers, connections. They go one step further than that. “You’re being told who you are all the time by Facebook and social media because which posts are coming up from your friends are due to an algorithm that is trying to get you to pay more attention to Facebook,” Lynch says. “That’s affecting our identity, because it affects who you think your friends are, because they’re the ones who are popping up higher on your feed.”
Ed Webb

Wearing a mask won't stop facial recognition anymore - The coronavirus is prompting fac... - 0 views

  • expanding this system to a wider group of people would be hard. When a population reaches a certain scale, the system is likely to encounter people with similar eyes. This might be why most commercial facial recognition systems that can identify masked faces seem limited to small-scale applications
  • Many residential communities, especially in areas hit hardest by the virus, have been limiting entry to residents only. Minivision introduced the new algorithm to its facial recognition gate lock systems in communities in Nanjing to quickly recognize residents without the need to take off masks.
  • SenseTime, which announced the rollout of its face mask-busting tech last week, explained that its algorithm is designed to read 240 facial feature key points around the eyes, mouth and nose. It can make a match using just the parts of the face that are visible.
  • ...1 more annotation...
  • New forms of facial recognition can now recognize not just people wearing masks over their mouths, but also people in scarves and even with fake beards. And the technology is already rolling out in China because of one unexpected event: The coronavirus outbreak.
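The matching idea described in the SenseTime annotation - comparing only the facial feature points that remain visible around the eyes - can be sketched in a few lines. This is a toy illustration, not SenseTime's actual algorithm; the landmark names, coordinates, and scoring function are all invented for the example.

```python
from math import dist

# Toy sketch of masked-face matching (NOT SenseTime's real system): a face is
# a dict of named landmark points, and two faces are compared using only the
# landmarks visible on both -- e.g. the eye region when a mask hides the rest.

def masked_match_score(face_a, face_b, visible):
    """Mean distance between the landmarks both faces share in 'visible'.

    Lower is more similar. Landmarks are (x, y) tuples in a normalized frame.
    """
    shared = [k for k in visible if k in face_a and k in face_b]
    if not shared:
        return float("inf")
    return sum(dist(face_a[k], face_b[k]) for k in shared) / len(shared)

# Hypothetical enrolled face and a masked probe (mouth/nose landmarks missing).
enrolled = {"left_eye": (0.30, 0.40), "right_eye": (0.70, 0.40),
            "nose_tip": (0.50, 0.60), "mouth": (0.50, 0.80)}
probe    = {"left_eye": (0.31, 0.41), "right_eye": (0.69, 0.40)}
stranger = {"left_eye": (0.20, 0.35), "right_eye": (0.80, 0.45)}

EYE_REGION = ["left_eye", "right_eye"]
print(masked_match_score(enrolled, probe, EYE_REGION) <
      masked_match_score(enrolled, stranger, EYE_REGION))  # True: probe is closer
```

The sketch also shows why the first annotation's scaling caveat holds: with only a handful of comparable points, distinct people start producing similar scores as the enrolled population grows.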
Ed Webb

At age 13, I joined the alt-right, aided by Reddit and Google - 0 views

  • Now, I’m 16, and I’ve been able to reflect on how I got sucked into that void—and how others do, too. My brief infatuation with the alt-right has helped me understand the ways big tech companies and their algorithms are contributing to the problem of radicalization—and why it’s so important to be skeptical of what you read online.
  • while a quick burst of radiation probably won’t give you cancer, prolonged exposure is far more dangerous. The same is true for the alt-right. I knew that the messages I was seeing were wrong, but the more I saw them, the more curious I became. I was unfamiliar with most of the popular discussion topics on Reddit. And when you want to know more about something, what do you do? You probably don’t think to go to the library and check out a book on that subject, and then fact check and cross reference what you find. If you just google what you want to know, you can get the information you want within seconds.
  • I started googling things like “Illegal immigration,” “Sandy Hook actors,” and “Black crime rate.” And I found exactly what I was looking for.
  • ...11 more annotations...
  • The articles and videos I first found all backed up what I was seeing on Reddit—posts that asserted a skewed version of actual reality, using carefully selected, out-of-context, and dubiously sourced statistics that propped up a hateful world view. On top of that, my online results were heavily influenced by something called an algorithm. I understand algorithms to be secretive bits of code that a website like YouTube will use to prioritize content that you are more likely to click on first. Because all of the content I was reading or watching was from far-right sources, all of the links that the algorithms dangled on my screen for me to click were from far-right perspectives.
  • I spent months isolated in my room, hunched over my computer, removing and approving memes on Reddit and watching conservative “comedians” that YouTube served up to me.
  • The inflammatory language and radical viewpoints used by the alt-right worked to YouTube and Google’s favor—the more videos and links I clicked on, the more ads I saw, and in turn, the more ad revenue they generated.
  • the biggest step in my recovery came when I attended a pro-Trump rally in Washington, D.C., in September 2017, about a month after the “Unite the Right” rally in Charlottesville, Virginia
  • The difference between the online persona of someone who identifies as alt-right and the real thing is so extreme that you would think they are different people. Online, they have the power of fake and biased news to form their arguments. They sound confident and usually deliver their standard messages strongly. When I met them in person at the rally, they were awkward and struggled to back up their statements. They tripped over their own words, and when they were called out by any counter protestors in the crowd, they would immediately use a stock response such as “You’re just triggered.”
  • Seeing for myself that the people I was talking to online were weak, confused, and backwards was the turning point for me.
  • we’re too far gone to reverse the damage that the alt-right has done to the internet and to naive adolescents who don’t know any better—children like the 13-year-old boy I was. It’s convenient for a massive internet company like Google to deliberately ignore why people like me get misinformed in the first place, as their profit-oriented algorithms continue to steer ignorant, malleable people into the jaws of the far-right
  • Dylann Roof, the white supremacist who murdered nine people in a Charleston, South Carolina, church in 2015, was radicalized by far-right groups that spread misinformation with the aid of Google’s algorithms.
  • Over the past couple months, I’ve been getting anti-immigration YouTube ads that feature an incident presented as a “news” story, about two immigrants who raped an American girl. The ad offers no context or sources, and uses heated language to denounce immigration and call for our county to allow ICE to seek out illegal immigrants within our area. I wasn’t watching a video about immigration or even politics when those ads came on; I was watching the old Monty Python “Cheese Shop” sketch. How does British satire, circa 1972, relate to America’s current immigration debate? It doesn’t.
  • tech companies need to be held accountable for the radicalization that results from their systems and standards.
  • anyone can be manipulated like I was. It’s so easy to find information online that we collectively forget that so much of the content the internet offers us is biased
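The feedback loop the author describes - clicks drive what gets shown, which drives more clicks - can be reduced to a minimal sketch. This is a deliberately simplified stand-in, not YouTube's or Google's actual ranking code; the item fields and topics are invented.

```python
from collections import Counter

# Toy sketch of an engagement-driven ranker (not any real platform's code):
# items whose topic the user has clicked get boosted, so the top of the feed
# narrows toward whatever the user already engaged with.

def rank(items, clicks):
    """Order items by how often the user clicked their topic (stable sort)."""
    return sorted(items, key=lambda item: -clicks[item["topic"]])

feed = [{"id": 1, "topic": "cooking"}, {"id": 2, "topic": "far_right"},
        {"id": 3, "topic": "sports"}, {"id": 4, "topic": "far_right"}]

clicks = Counter()
for _ in range(5):            # the user clicks far-right items a few times...
    clicks["far_right"] += 1

top = rank(feed, clicks)[0]   # ...and that topic now owns the top slot
print(top["topic"])  # far_right
```

Nothing in the loop needs to "intend" radicalization: optimizing for clicks alone is enough to keep serving more of the same, which is the essay's point about profit-oriented algorithms.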
Ed Webb

How the U.S. Military Buys Location Data from Ordinary Apps - 0 views

  • The U.S. military is buying the granular movement data of people around the world, harvested from innocuous-seeming apps, Motherboard has learned. The most popular app among a group Motherboard analyzed connected to this sort of data sale is a Muslim prayer and Quran app that has more than 98 million downloads worldwide. Others include a Muslim dating app, a popular Craigslist app, an app for following storms, and a "level" app that can be used to help, for example, install shelves in a bedroom.
  • The Locate X data itself is anonymized, but the source said "we could absolutely deanonymize a person." Babel Street employees would "play with it, to be honest,"
  • "Our access to the software is used to support Special Operations Forces mission requirements overseas. We strictly adhere to established procedures and policies for protecting the privacy, civil liberties, constitutional and legal rights of American citizens."
  • ...7 more annotations...
  • In March, tech publication Protocol first reported that U.S. law enforcement agencies such as Customs and Border Protection (CBP) and Immigration and Customs Enforcement (ICE) were using Locate X. Motherboard then obtained an internal Secret Service document confirming the agency's use of the technology. Some government agencies, including CBP and the Internal Revenue Service (IRS), have also purchased access to location data from another vendor called Venntel.
  • the company tracks 25 million devices inside the United States every month, and 40 million elsewhere, including in the European Union, Latin America, and the Asia-Pacific region
  • Motherboard found another network of dating apps that look and operate nearly identically to Mingle, including sending location data to X-Mode. Motherboard installed another dating app, called Iran Social, on a test device and observed GPS coordinates being sent to the company. The network of apps also includes Turkey Social, Egypt Social, Colombia Social, and others focused on particular countries.
  • Senator Ron Wyden told Motherboard in a statement that X-Mode said it is selling location data harvested from U.S. phones to U.S. military customers. "In a September call with my office, lawyers for the data broker X-Mode Social confirmed that the company is selling data collected from phones in the United States to U.S. military customers, via defense contractors. Citing non-disclosure agreements, the company refused to identify the specific defense contractors or the specific government agencies buying the data,"
  • some apps that are harvesting location data on behalf of X-Mode are essentially hiding the data transfer. Muslim Pro does not mention X-Mode in its privacy policy, and did not provide any sort of pop-up when installing or opening the app that explained the transfer of location data in detail. The privacy policy does say Muslim Pro works with Tutela and Quadrant, two other location data companies, however. Motherboard did observe data transfer to Tutela.
  • The Muslim Mingle app provided no pop-up disclosure in Motherboard's tests, nor does the app's privacy policy mention X-Mode at all. Iran Social, one of the apps in the second network of dating apps that used much of the same code, also had the same lack of disclosures around the sale of location data.
  • "The question to ask is whether a reasonable consumer of these services would foresee these uses and agree to them if explicitly asked. It is safe to say from this context that the reasonable consumer—who is not a tech person—would not have military uses of their data in mind, even if they read the disclosures."
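The Babel Street source's claim that "anonymized" Locate X data "could absolutely" be deanonymized rests on a well-known property of location trails: a device's most frequent nighttime coordinate is usually its owner's home. A minimal sketch of that intuition, with entirely invented data:

```python
from collections import Counter

# Toy illustration of why anonymized location trails deanonymize easily:
# pick the most common nighttime (midnight-5 a.m.) coordinate for a device ID.
# All coordinates and timestamps below are invented.

def likely_home(pings):
    """Most common (lat, lon), rounded to ~100 m, among nighttime pings."""
    night = [(round(lat, 3), round(lon, 3))
             for lat, lon, hour in pings if 0 <= hour <= 5]
    return Counter(night).most_common(1)[0][0] if night else None

pings = [
    (38.9072, -77.0369, 14),   # downtown, afternoon
    (38.8512, -77.0402, 1),    # same block, 1 a.m.
    (38.8513, -77.0401, 2),    # same block, 2 a.m.
    (38.8512, -77.0402, 3),    # same block, 3 a.m.
]
print(likely_home(pings))  # (38.851, -77.04)
```

Cross-reference that coordinate with public property records and the "anonymous" device ID has a name attached - no sophisticated tooling required.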
Ed Webb

I unintentionally created a biased AI algorithm 25 years ago - tech companies are still... - 0 views

  • How and why do well-educated, well-intentioned scientists produce biased AI systems? Sociological theories of privilege provide one useful lens.
  • Their training data is biased. They are designed by an unrepresentative group. They face the mathematical impossibility of treating all categories equally. They must somehow trade accuracy for fairness. And their biases are hiding behind millions of inscrutable numerical parameters.
  • fairness can still be the victim of competitive pressures in academia and industry. The flawed Bard and Bing chatbots from Google and Microsoft are recent evidence of this grim reality. The commercial necessity of building market share led to the premature release of these systems.
  • ...3 more annotations...
  • Scientists also face a nasty subconscious dilemma when incorporating diversity into machine learning models: Diverse, inclusive models perform worse than narrow models.
  • biased AI systems can still be created unintentionally and easily. It’s also clear that the bias in these systems can be harmful, hard to detect and even harder to eliminate.
  • with North American computer science doctoral programs graduating only about 23% female, and 3% Black and Latino students, there will continue to be many rooms and many algorithms in which underrepresented groups are not represented at all.
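The annotations above describe a mechanism, not just a mood: when one group dominates the training data, a model tuned for overall accuracy fits the majority group's pattern and quietly absorbs a higher error rate on the minority group. A minimal sketch of that dynamic, using entirely hypothetical synthetic data and a toy one-parameter "model" (a single decision threshold):

```python
# Hypothetical illustration of the bias mechanism described above: a single
# decision threshold tuned for overall accuracy fits the majority group's
# labeling rule and fails more often on the minority group.
# All data here is synthetic.

# Each sample: (feature value, group, true label).
# Group A (90 samples) follows the rule "label 1 iff x >= 5";
# group B (10 samples) follows "label 1 iff x >= 7".
majority = [(x, "A", 1 if x >= 5 else 0) for x in range(10)] * 9
minority = [(x, "B", 1 if x >= 7 else 0) for x in range(10)]
data = majority + minority

def best_global_threshold(samples):
    """Choose the threshold that maximizes accuracy over ALL samples."""
    return max(range(11),
               key=lambda t: sum((x >= t) == bool(y) for x, _, y in samples))

def group_accuracy(samples, group, t):
    """Accuracy of the threshold rule restricted to one group."""
    g = [(x, y) for x, grp, y in samples if grp == group]
    return sum((x >= t) == bool(y) for x, y in g) / len(g)

t = best_global_threshold(data)  # lands on the majority group's rule (t = 5)
print("threshold:", t)
print("group A accuracy:", group_accuracy(data, "A", t))  # 1.0
print("group B accuracy:", group_accuracy(data, "B", t))  # 0.8
```

The globally "best" model is perfect for group A and wrong 20% of the time for group B, even though nothing in the code mentions groups at all — which is also why, as the excerpt notes, the bias hides easily and correcting it (a separate threshold per group, reweighting the data) costs some overall accuracy.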
Ed Webb

Drone warfare's deadly civilian toll: a very personal view | James Jeffrey | Comment is... - 0 views

  • Both Pakistan and Yemen are arguably less stable and more hostile to the west as a result of President Obama's increased reliance on drones. When surveying the poisoned legacy left to the Iraqi people, and what will be left to the Afghan people, it's beyond depressing to hear of the hawks circling around other theatres like Pakistan and Yemen, stoking the flames of interventionism. I fear the folly in which I took part will never end, and society will be irreversibly enmeshed in what George Orwell's 1984 warned of: constant wars against the Other, in order to forge false unity and fealty to the state.
  • in Afghanistan, the linguistic corruption that always attends war meant we'd refer to "hot spots", "multiple pax on the ground" and "prosecuting a target", or "maximising the kill chain".
  • encroachment of drones into the civilian realm is also gaining momentum. President Obama signed a federal law on 14 February 2012, allowing drones for a variety of commercial uses and for police law enforcement. The skies above may never be the same. As with most of America's darker elements, such as its gun culture, there's profit to be made – the market for drones is already valued at $5.9bn and is expected to double in 10 years.
  • ...1 more annotation...
  • Technological advancements in warfare don't have a good track record in terms of unintended consequences
Ed Webb

Internet Privacy: Why Library of Congress Twitter Archives Could be a Bad Thing | Geek ... - 0 views

  • Obama
Ed Webb

Artificial meat? Food for thought by 2050 | Environment | The Guardian - 0 views

  • even with new technologies such as genetic modification and nanotechnology, hundreds of millions of people may still go hungry owing to a combination of climate change, water shortages and increasing food consumption.
  • Many low-tech ways are considered to effectively increase yields, such as reducing the 30-40% food waste that occurs both in rich and poor countries. If developing countries had better storage facilities and supermarkets and consumers in rich countries bought only what they needed, there would be far more food available.
  • Two "wild cards" could transform global meat and milk production. "One is artificial meat, which is made in a giant vat, and the other is nanotechnology, which is expected to become more important as a vehicle for delivering medication to livestock."
  • ...4 more annotations...
  • One of the gloomiest assessments comes from a team of British and South African economists who say that a vast effort must be made in agricultural research to create a new green revolution, but that seven multinational corporations, led by Monsanto, now dominate the global technology field.
  • a threat to the global commons in agricultural technology on which the green revolution has depended
  • Up to 70% of the energy needed to grow and supply food at present is fossil-fuel based which in turn contributes to climate change
  • The 21 papers published today in a special open-access edition of the Philosophical Transactions of the Royal Society (royalsociety.org) are part of a UK government Foresight study on the future of the global food industry. The final report will be published later this year in advance of the UN climate talks in Cancun, Mexico.
Ed Webb

Could self-aware cities be the first forms of artificial intelligence? - 1 views

  • People have speculated before about the idea that the Internet might become self-aware and turn into the first "real" A.I., but could it be more likely to happen to cities, in which humans actually live and work and navigate, generating an even more chaotic system?
  • "By connecting and providing visibility into disparate systems, cities and buildings can operate like living organisms, sensing and responding quickly to potential problems before they occur to protect citizens, save resources and reduce energy consumption and carbon emissions," reads the invitation to IBM's PULSE 2010 event.
  • And Cisco is already building the first of these smart cities: Songdo, a Korean "instant city," which will be completely controlled by computer networks — including ubiquitous Telepresence applications, video screens which could be used for surveillance. Cisco's chief globalization officer, Wim Elfrink, told the San Jose Mercury News: Everything will be connected - buildings, cars, energy - everything. This is the tipping point. When we start building cities with technology in the infrastructure, it's beyond my imagination what that will enable.
  • ...9 more annotations...
  • Urbanscale founder Adam Greenfield has written a lot about ubiquitous computing in urban environments, most notably in 2006's Everyware, which posits that computers will "effectively disappear" as objects around us become "smart" in ways that are nearly invisible to lay-people.
  • tailored advertising just about anywhere
  • Some futurists are still predicting that cities will become closer to arcologies — huge slabs of integrated urban life, like a whole city in a single block — as they grapple with the need to house so many people in an efficient fashion. The implications for heating and cooling an arcology, let alone dealing with waste disposal, are mind-boggling. Could a future arcology become our first machine mind?
  • Science fiction gives us the occasional virtual worlds that look rural — like Doctor Who's visions of life inside the Matrix, which mostly looks (not surprisingly) like a gravel quarry — but for the most part, virtual worlds are always urban
  • So here's why cities might have an edge over, say, the Internet as a whole, when it comes to developing self awareness. Because every city is different, and every city has its own identity and sense of self — and this informs everything from urban planning to the ways in which parking and electricity use are mapped out. The more sophisticated the integrated systems associated with a city become, the more they'll reflect the city's unique personality, and the more programmers will try to imbue their computers with a sense of this unique urban identity. And a sense of the city's history, and the ways in which the city has evolved and grown, will be important for a more sophisticated urban planning system to grasp the future — so it's very possible to imagine this leading to a sense of personal history, on the part of a computer that identifies with the city it helps to manage.
  • next time you're wandering around your city, looking up at the outcroppings of huge buildings, the wild tides of traffic and the frenzy of construction and demolition, don't just think of it as a place haunted by history. Try, instead, to imagine it coming to life in a new way, opening its millions of electronic eyes, and greeting you with the first gleaming of independent thought
  • I can't wait for the day when city AI's decide to go to war with other city AI's over allocation of federal funds.
  • John Shirley has San Francisco as a sentient being in City Come A-Walkin'
  • I doubt cities will ever be networked so smoothly... they are all about fractions, sections, niches, subcultures, ethnicities, neighborhoods, markets, underground markets. It's literally like herding cats... I don't see it as feasible. It would be a schizophrenic intelligence at best. Which, Wintermute was I suppose...
  • This is beginning to sound just like the cities we have read about. To me it sort of reminds me of the Burning Chrome stories, as an element in all those stories was machines and technology at every turn. With the recent advances in technology it is alarming to see that an element in many science fiction tales is finally coming true: a city that acts as a machine in itself. Who is to say that this city won't develop a highly active hacker underbelly?