
Dystopias: group items tagged "universities"


Ed Webb

Our Digitally Undying Memories - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • as Viktor Mayer-Schönberger argues convincingly in his book Delete: The Virtue of Forgetting in the Digital Age (Princeton University Press, 2009), the costs of such powerful collective memory are often higher than we assume.
  • "Total recall" renders context, time, and distance irrelevant. Something that happened 40 years ago—whether youthful or scholarly indiscretion—still matters and can come back to harm us as if it had happened yesterday.
  • an important "third wave" of work about the digital environment. In the late 1990s and early 2000s, we saw books like Nicholas Negroponte's Being Digital (Knopf, 1995) and Howard Rheingold's The Virtual Community: Homesteading on the Electronic Frontier (Addison-Wesley, 1993) and Smart Mobs: The Next Social Revolution (Perseus, 2002), which idealistically described the transformative powers of digital networks. Then we saw shallow blowback, exemplified by Susan Jacoby's The Age of American Unreason (Pantheon, 2008).
  • ...14 more annotations...
  • For most of human history, forgetting was the default and remembering the challenge.
  • Chants, songs, monasteries, books, libraries, and even universities were established primarily to overcome our propensity to forget over time. The physical and economic limitations of all of those technologies and institutions served us well. Each acted not just as memory aids but also as filters or editors. They helped us remember much by helping us discard even more.
    • Ed Webb
       
      Excellent point, well made.
  • Just because we have the vessels, we fill them.
  • Even 10 years ago, we did not consider that words written for a tiny audience could reach beyond, perhaps to someone unforgiving, uninitiated in a community, or just plain unkind.
  • Remembering to forget, as Elvis argued, is also essential to getting over heartbreak. And, as Jorge Luis Borges wrote in his 1942 (yep, I Googled it to find the date) story "Funes el memorioso," it is just as important to the act of thinking. Funes, the young man in the story afflicted with an inability to forget anything, can't make sense of it. He can't think abstractly. He can't judge facts by relative weight or seriousness. He is lost in the details. Painfully, Funes cannot rest.
  • Our use of the proliferating data and rudimentary filters in our lives renders us incapable of judging, discriminating, or engaging in deductive reasoning. And inductive reasoning, which one could argue is entering a golden age with the rise of huge databases and the processing power needed to detect patterns and anomalies, is beyond the reach of lay users of the grand collective database called the Internet.
  • the default habits of our species: to record, retain, and release as much information as possible
  • Perhaps we just have to learn to manage wisely how we digest, discuss, and publicly assess the huge archive we are building. We must engender cultural habits that ensure perspective, calm deliberation, and wisdom. That's hard work.
  • we choose the nature of technologies. They don't choose us. We just happen to choose unwisely with some frequency
  • surveillance as the chief function of electronic government
  • critical information studies
  • Siva Vaidhyanathan is an associate professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything, is forthcoming from the University of California Press.
  • Nietzsche's _On the Use and Disadvantage of History for Life_
  • Google compresses, if not eliminates, temporal context. This is likely only to exacerbate the existing problem in politics of taking one's statements out of context. A politician whose views on a subject have evolved quite logically over decades in light of changing knowledge and/or circumstances is held up in attack ads as a flip-flopper because consecutive Google entries have him/her saying two opposite things about the same subject -- and never mind that between the two statements, the Berlin Wall may have fallen or the economy crashed harder than at any other time since 1929.
Ed Webb

Skipping Class? Sensors Are Watching - Wired Campus - The Chronicle of Higher Education - 0 views

  • Students at Northern Arizona University who hope to skip large lecture courses may have more trouble doing so this fall: The university is installing an electronic system that measures student attendance. The university is using $75,000 in federal stimulus money to install the system, which will detect the ID cards students are carrying as they enter large classrooms. (The cards can be read by an electronic sensor.) Faculty members can choose to receive electronic attendance reports.
  • the Facebook group "NAU Against Proximity Cards," which has over 1,300 members.
  • In light of last week's vote by its legislature, I think the Arizona schools should go a step further and require passports and birth certificates to enter classrooms.
Ed Webb

The Web Means the End of Forgetting - NYTimes.com - 1 views

  • for a great many people, the permanent memory bank of the Web increasingly means there are no second chances — no opportunities to escape a scarlet letter in your digital past. Now the worst thing you’ve done is often the first thing everyone knows about you.
  • a collective identity crisis. For most of human history, the idea of reinventing yourself or freely shaping your identity — of presenting different selves in different contexts (at home, at work, at play) — was hard to fathom, because people’s identities were fixed by their roles in a rigid social hierarchy. With little geographic or social mobility, you were defined not as an individual but by your village, your class, your job or your guild. But that started to change in the late Middle Ages and the Renaissance, with a growing individualism that came to redefine human identity. As people perceived themselves increasingly as individuals, their status became a function not of inherited categories but of their own efforts and achievements. This new conception of malleable and fluid identity found its fullest and purest expression in the American ideal of the self-made man, a term popularized by Henry Clay in 1832.
  • the dawning of the Internet age promised to resurrect the ideal of what the psychiatrist Robert Jay Lifton has called the “protean self.” If you couldn’t flee to Texas, you could always seek out a new chat room and create a new screen name. For some technology enthusiasts, the Web was supposed to be the second flowering of the open frontier, and the ability to segment our identities with an endless supply of pseudonyms, avatars and categories of friendship was supposed to let people present different sides of their personalities in different contexts. What seemed within our grasp was a power that only Proteus possessed: namely, perfect control over our shifting identities. But the hope that we could carefully control how others view us in different contexts has proved to be another myth. As social-networking sites expanded, it was no longer quite so easy to have segmented identities: now that so many people use a single platform to post constant status updates and photos about their private and public activities, the idea of a home self, a work self, a family self and a high-school-friends self has become increasingly untenable. In fact, the attempt to maintain different selves often arouses suspicion.
  • ...20 more annotations...
  • All around the world, political leaders, scholars and citizens are searching for responses to the challenge of preserving control of our identities in a digital world that never forgets. Are the most promising solutions going to be technological? Legislative? Judicial? Ethical? A result of shifting social norms and cultural expectations? Or some mix of the above?
  • These approaches share the common goal of reconstructing a form of control over our identities: the ability to reinvent ourselves, to escape our pasts and to improve the selves that we present to the world.
  • many technological theorists assumed that self-governing communities could ensure, through the self-correcting wisdom of the crowd, that all participants enjoyed the online identities they deserved. Wikipedia is one embodiment of the faith that the wisdom of the crowd can correct most mistakes — that a Wikipedia entry for a small-town mayor, for example, will reflect the reputation he deserves. And if the crowd fails — perhaps by turning into a digital mob — Wikipedia offers other forms of redress
  • In practice, however, self-governing communities like Wikipedia — or algorithmically self-correcting systems like Google — often leave people feeling misrepresented and burned. Those who think that their online reputations have been unfairly tarnished by an isolated incident or two now have a practical option: consulting a firm like ReputationDefender, which promises to clean up your online image. ReputationDefender was founded by Michael Fertik, a Harvard Law School graduate who was troubled by the idea of young people being forever tainted online by their youthful indiscretions. “I was seeing articles about the ‘Lord of the Flies’ behavior that all of us engage in at that age,” he told me, “and it felt un-American that when the conduct was online, it could have permanent effects on the speaker and the victim. The right to new beginnings and the right to self-definition have always been among the most beautiful American ideals.”
  • In the Web 3.0 world, Fertik predicts, people will be rated, assessed and scored based not on their creditworthiness but on their trustworthiness as good parents, good dates, good employees, good baby sitters or good insurance risks.
  • “Our customers include parents whose kids have talked about them on the Internet — ‘Mom didn’t get the raise’; ‘Dad got fired’; ‘Mom and Dad are fighting a lot, and I’m worried they’ll get a divorce.’ ”
  • as facial-recognition technology becomes more widespread and sophisticated, it will almost certainly challenge our expectation of anonymity in public
  • Ohm says he worries that employers would be able to use social-network-aggregator services to identify people’s book and movie preferences and even Internet-search terms, and then fire or refuse to hire them on that basis. A handful of states — including New York, California, Colorado and North Dakota — broadly prohibit employers from discriminating against employees for legal off-duty conduct like smoking. Ohm suggests that these laws could be extended to prevent certain categories of employers from refusing to hire people based on Facebook pictures, status updates and other legal but embarrassing personal information. (In practice, these laws might be hard to enforce, since employers might not disclose the real reason for their hiring decisions, so employers, like credit-reporting agents, might also be required by law to disclose to job candidates the negative information in their digital files.)
  • research group’s preliminary results suggest that if rumors spread about something good you did 10 years ago, like winning a prize, they will be discounted; but if rumors spread about something bad that you did 10 years ago, like driving drunk, that information has staying power
  • many people aren’t worried about false information posted by others — they’re worried about true information they’ve posted about themselves when it is taken out of context or given undue weight. And defamation law doesn’t apply to true information or statements of opinion. Some legal scholars want to expand the ability to sue over true but embarrassing violations of privacy — although it appears to be a quixotic goal.
  • Researchers at the University of Washington, for example, are developing a technology called Vanish that makes electronic data “self-destruct” after a specified period of time. Instead of relying on Google, Facebook or Hotmail to delete the data that is stored “in the cloud” — in other words, on their distributed servers — Vanish encrypts the data and then “shatters” the encryption key. To read the data, your computer has to put the pieces of the key back together, but they “erode” or “rust” as time passes, and after a certain point the document can no longer be read. [A toy sketch of this key-splitting idea follows these annotations.]
  • Plenty of anecdotal evidence suggests that young people, having been burned by Facebook (and frustrated by its privacy policy, which at more than 5,000 words is longer than the U.S. Constitution), are savvier than older users about cleaning up their tagged photos and being careful about what they post.
  • norms are already developing to recreate off-the-record spaces in public, with no photos, Twitter posts or blogging allowed. Milk and Honey, an exclusive bar on Manhattan’s Lower East Side, requires potential members to sign an agreement promising not to blog about the bar’s goings on or to post photos on social-networking sites, and other bars and nightclubs are adopting similar policies. I’ve been at dinners recently where someone has requested, in all seriousness, “Please don’t tweet this” — a custom that is likely to spread.
  • There’s already a sharp rise in lawsuits known as Twittergation — that is, suits to force Web sites to remove slanderous or false posts.
  • strategies of “soft paternalism” that might nudge people to hesitate before posting, say, drunken photos from Cancún. “We could easily think about a system, when you are uploading certain photos, that immediately detects how sensitive the photo will be.”
  • It’s sobering, now that we live in a world misleadingly called a “global village,” to think about privacy in actual, small villages long ago. In the villages described in the Babylonian Talmud, for example, any kind of gossip or tale-bearing about other people — oral or written, true or false, friendly or mean — was considered a terrible sin because small communities have long memories and every word spoken about other people was thought to ascend to the heavenly cloud. (The digital cloud has made this metaphor literal.) But the Talmudic villages were, in fact, far more humane and forgiving than our brutal global village, where much of the content on the Internet would meet the Talmudic definition of gossip: although the Talmudic sages believed that God reads our thoughts and records them in the book of life, they also believed that God erases the book for those who atone for their sins by asking forgiveness of those they have wronged. In the Talmud, people have an obligation not to remind others of their past misdeeds, on the assumption they may have atoned and grown spiritually from their mistakes. “If a man was a repentant [sinner],” the Talmud says, “one must not say to him, ‘Remember your former deeds.’ ” Unlike God, however, the digital cloud rarely wipes our slates clean, and the keepers of the cloud today are sometimes less forgiving than their all-powerful divine predecessor.
  • On the Internet, it turns out, we’re not entitled to demand any particular respect at all, and if others don’t have the empathy necessary to forgive our missteps, or the attention spans necessary to judge us in context, there’s nothing we can do about it.
  • Gosling is optimistic about the implications of his study for the possibility of digital forgiveness. He acknowledged that social technologies are forcing us to merge identities that used to be separate — we can no longer have segmented selves like “a home or family self, a friend self, a leisure self, a work self.” But although he told Facebook, “I have to find a way to reconcile my professor self with my having-a-few-drinks self,” he also suggested that as all of us have to merge our public and private identities, photos showing us having a few drinks on Facebook will no longer seem so scandalous. “You see your accountant going out on weekends and attending clown conventions, that no longer makes you think that he’s not a good accountant. We’re coming to terms and reconciling with that merging of identities.”
  • a humane society values privacy, because it allows people to cultivate different aspects of their personalities in different contexts; and at the moment, the enforced merging of identities that used to be separate is leaving many casualties in its wake.
  • we need to learn new forms of empathy, new ways of defining ourselves without reference to what others say about us and new ways of forgiving one another for the digital trails that will follow us forever
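
A toy sketch of the Vanish idea highlighted above, assuming a simplified all-or-nothing XOR key split and the Python `cryptography` package's Fernet cipher in place of the real system's threshold secret sharing and DHT storage; every name and detail here is illustrative, not Vanish's actual code.

    # Toy sketch of self-destructing data in the spirit of Vanish (a simplification,
    # not the published system): encrypt, split the key, then lose shares over time.
    import secrets
    from functools import reduce
    from cryptography.fernet import Fernet

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(key: bytes, n: int) -> list[bytes]:
        """Split key into n shares; every share is needed to rebuild it."""
        shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
        shares.append(reduce(xor_bytes, shares, key))
        return shares

    def recover_key(shares: list[bytes]) -> bytes:
        return reduce(xor_bytes, shares)

    key = Fernet.generate_key()                       # key for the message
    token = Fernet(key).encrypt(b"words written for a tiny audience")
    shares = split_key(key, n=5)                      # "shatter" the key
    del key                                           # keep only the shares

    print(Fernet(recover_key(shares)).decrypt(token))  # still readable

    shares.pop()   # simulate shares "eroding" over time (storage nodes dropping them)
    try:
        Fernet(recover_key(shares)).decrypt(token)
    except Exception:
        print("key irrecoverable; the data has effectively self-destructed")

Once any share is gone, the key cannot be reassembled and the ciphertext becomes permanently unreadable, which is the property the researchers describe.
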
Ed Webb

Cambridge University to open 'Terminator centre' to study threat to humans from artific... - 0 views

  • the four greatest threats to the human species - artificial intelligence, climate change, nuclear war and rogue biotechnology.
  • Huw Price, Bertrand Russell Professor of Philosophy and another of the centre's three founders, said such an 'ultra-intelligent machine, or artificial general intelligence (AGI)' could have very serious consequences. He said: 'Nature didn’t anticipate us, and we in our turn shouldn’t take AGI for granted. We need to take seriously the possibility that there might be a ‘Pandora’s box’ moment with AGI that, if missed, could be disastrous.'
Ed Webb

Goodbye petabytes, hello zettabytes | Technology | The Guardian - 0 views

  • Every man, woman and child on the planet using micro-blogging site Twitter for a century. For many people that may sound like a vision of hell, but for watchers of the tremendous growth of digital communications it is a neat way of presenting the sheer scale of the so-called digital universe.
  • Mobile phones have dramatically widened the range of people who can create, store and share digital information. "China now has more visible devices out on the streets being used by individuals than the US does," said McDonald. "We are seeing the democratisation and commoditisation of the use and creation of information."
  • experts estimate that all human language used since the dawn of time would take up about 5,000 petabytes if stored in digital form, which is less than 1% of the digital content created since someone first switched on a computer. [A quick unit check on these figures follows these annotations.]
  • ...6 more annotations...
  • A zettabyte, incidentally, is roughly half a million times the entire collections of all the academic libraries in the United States.
  • the growing desire of corporations and governments to know and store ever more data about everyone
  • About 70% of the digital universe is generated by individuals, but its storage is then predominantly the job of corporations. From emails and blogs to mobile phone calls, it is corporations that are storing information on behalf of consumers.
  • actions in the offline world that individuals carry out which result in digital content being created by organisations – from cashpoint transactions which a bank must record to walking along the pavement, which is likely to result in CCTV footage
  • "unstructured"
  • "You talk to a kid these days and they have no idea what a kilobyte is. The speed things progress, we are going to need many words beyond zettabyte."
Ed Webb

if you had a cyberpunk universe where th... - 0 views

if you had a cyberpunk universe where the richest man in the world is posting ads for a shell company called "ambrosia" that gathers blood from poor people to extend his lifespan you would rightly...

twitter

started by Ed Webb on 15 Sep 18 no follow-up yet
Ed Webb

Border Patrol, Israel's Elbit Put Reservation Under Surveillance - 0 views

  • The vehicle is parked where U.S. Customs and Border Protection will soon construct a 160-foot surveillance tower capable of continuously monitoring every person and vehicle within a radius of up to 7.5 miles. The tower will be outfitted with high-definition cameras with night vision, thermal sensors, and ground-sweeping radar, all of which will feed real-time data to Border Patrol agents at a central operating station in Ajo, Arizona. The system will store an archive with the ability to rewind and track individuals’ movements across time — an ability known as “wide-area persistent surveillance.” CBP plans 10 of these towers across the Tohono O’odham reservation, which spans an area roughly the size of Connecticut. Two will be located near residential areas, including Rivas’s neighborhood, which is home to about 50 people. To build them, CBP has entered a $26 million contract with the U.S. division of Elbit Systems, Israel’s largest military company.
  • U.S. borderlands have become laboratories for new systems of enforcement and control
  • these same systems often end up targeting other marginalized populations as well as political dissidents
  • ...16 more annotations...
  • the spread of persistent surveillance technologies is particularly worrisome because they remove any limit on how much information police can gather on a person’s movements. “The border is the natural place for the government to start using them, since there is much more public support to deploy these sorts of intrusive technologies there,”
  • the company’s ultimate goal is to build a “layer” of electronic surveillance equipment across the entire perimeter of the U.S. “Over time, we’ll expand not only to the northern border, but to the ports and harbors across the country,”
  • In addition to fixed and mobile surveillance towers, other technology that CBP has acquired and deployed includes blimps outfitted with high-powered ground and air radar, sensors buried underground, and facial recognition software at ports of entry. CBP’s drone fleet has been described as the largest of any U.S. agency outside the Department of Defense
  • Nellie Jo David, a Tohono O’odham tribal member who is writing her dissertation on border security issues at the University of Arizona, says many younger people who have been forced by economic circumstances to work in nearby cities are returning home less and less, because they want to avoid the constant surveillance and harassment. “It’s especially taken a toll on our younger generations.”
  • Border militarism has been spreading worldwide owing to neoliberal economic policies, wars, and the onset of the climate crisis, all of which have contributed to the uprooting of increasingly large numbers of people, notes Reece Jones
  • In the U.S., leading companies with border security contracts include long-established contractors such as Lockheed Martin in addition to recent upstarts such as Anduril Industries, founded by tech mogul Palmer Luckey to feed the growing market for artificial intelligence and surveillance sensors — primarily in the borderlands. Elbit Systems has frequently touted a major advantage over these competitors: the fact that its products are “field-proven” on Palestinians
  • Verlon Jose, then-tribal vice chair, said that many nation members calculated that the towers would help dissuade the federal government from building a border wall across their lands. The Tohono O’odham are “only as sovereign as the federal government allows us to be,”
  • Leading Democrats have argued for the development of an ever-more sophisticated border surveillance state as an alternative to Trump’s border wall. “The positive, shall we say, almost technological wall that can be built is what we should be doing,” House Speaker Nancy Pelosi said in January. But for those crossing the border, the development of this surveillance apparatus has already taken a heavy toll. In January, a study published by researchers from the University of Arizona and Earlham College found that border surveillance towers have prompted migrants to cross along more rugged and circuitous pathways, leading to greater numbers of deaths from dehydration, exhaustion, and exposure.
  • “Walls are not only a question of blocking people from moving, but they are also serving as borders or frontiers between where you enter the surveillance state,” she said. “The idea is that at the very moment you step near the border, Elbit will catch you. Something similar happens in Palestine.”
  • CBP is by far the largest law enforcement entity in the U.S., with 61,400 employees and a 2018 budget of $16.3 billion — more than the militaries of Iran, Mexico, Israel, and Pakistan. The Border Patrol has jurisdiction 100 miles inland from U.S. borders, making roughly two-thirds of the U.S. population theoretically subject to its operations, including the entirety of the Tohono O’odham reservation
  • Between 2013 and 2016, for example, roughly 40 percent of Border Patrol seizures at immigration enforcement checkpoints involved 1 ounce or less of marijuana confiscated from U.S. citizens.
  • the agency uses its sprawling surveillance apparatus for purposes other than border enforcement
  • documents obtained via public records requests suggest that CBP drone flights included surveillance of Dakota Access pipeline protests
  • CBP’s repurposing of the surveillance tower and drones to surveil dissidents hints at other possible abuses. “It’s a reminder that technologies that are sold for one purpose, such as protecting the border or stopping terrorists — or whatever the original justification may happen to be — so often get repurposed for other reasons, such as targeting protesters.”
  • The impacts of the U.S. border on Tohono O’odham people date to the mid-19th century. The tribal nation’s traditional land extended 175 miles into Mexico before being severed by the 1853 Gadsden Purchase, a U.S. acquisition of land from the Mexican government. As many as 2,500 of the tribe’s more than 30,000 members still live on the Mexican side. Tohono O’odham people used to travel between the United States and Mexico fairly easily on roads without checkpoints to visit family, perform ceremonies, or obtain health care. But that was before the Border Patrol arrived en masse in the mid-2000s, turning the reservation into something akin to a military occupation zone. Residents say agents have administered beatings, used pepper spray, pulled people out of vehicles, shot two Tohono O’odham men under suspicious circumstances, and entered people’s homes without warrants. “It is apartheid here,” Ofelia Rivas says. “We have to carry our papers everywhere. And everyone here has experienced the Border Patrol’s abuse in some way.”
  • Tohono O’odham people have developed common cause with other communities struggling against colonization and border walls. David is among numerous activists from the U.S. and Mexican borderlands who joined a delegation to the West Bank in 2017, convened by Stop the Wall, to build relationships and learn about the impacts of Elbit’s surveillance systems. “I don’t feel safe with them taking over my community, especially if you look at what’s going on in Palestine — they’re bringing the same thing right over here to this land,” she says. “The U.S. government is going to be able to surveil basically anybody on the nation.”
Ed Webb

Google and Apple Digital Mapping | Data Collection - 0 views

  • There is a sense, in fact, in which mapping is the essence of what Google does. The company likes to talk about services such as Maps and Earth as if they were providing them for fun - a neat, free extra as a reward for using their primary offering, the search box. But a search engine, in some sense, is an attempt to map the world of information - and when you can combine that conceptual world with the geographical one, the commercial opportunities suddenly explode.
  • it's hard to interpret the occasional aerial snapshot of your garden as a big issue when the phone in your pocket is assembling a real-time picture of your movements, preferences and behaviour
  • There's no technical reason why, perhaps in return for a cheaper phone bill, you mightn't consent to be shown not the quickest route between two points, but the quickest route that passes at least one Starbucks. If you're looking at the world through Google glasses, who determines which aspects of "augmented reality" data you see - and did they pay for the privilege?
  • ...6 more annotations...
  • "The map is mapping us," says Martin Dodge, a senior lecturer in human geography at Manchester University. "I'm not paranoid, but I am quite suspicious and cynical about products that appear to be innocent and neutral, but that are actually vacuuming up all kinds of behavioural and attitudinal data."
  • In a world of GPS-enabled smartphones, you're not just consulting Google's or Apple's data stores when you consult a map: you're adding to them.
  • "There's kind of a fine line that you run," said Ed Parsons, Google's chief geospatial technologist, in a session at the Aspen Ideas Festival in Colorado, "between this being really useful, and it being creepy."
  • "Google and Apple are saying that they want control over people's real and imagined space."
  • It can be easy to assume that maps are objective: that the world is out there, and that a good map is one that represents it accurately. But that's not true. Any square kilometre of the planet can be described in an infinite number of ways: in terms of its natural features, its weather, its socio-economic profile, or what you can buy in the shops there. Traditionally, the interests reflected in maps have been those of states and their armies, because they were the ones who did the map-making, and the primary use of many such maps was military. (If you had the better maps, you stood a good chance of winning the battle. The logo of Britain's Ordnance Survey still includes a visual reference to the 18th-century War Department.) Now, the power is shifting. "Every map," the cartography curator Lucy Fellowes once said, "is someone's way of getting you to look at the world his or her way."
  • The question cartographers are always being asked at cocktail parties, says Heyman, is whether there's really any map-making still left to do: we've mapped the whole planet already, haven't we? The question could hardly be more misconceived. We are just beginning to grasp what it means to live in a world in which maps are everywhere - and in which, by using maps, we are mapped ourselves.
Ed Webb

Six scientists tell us about the most accurate science fiction in their fields - 0 views

  • David Barash, Evolutionary Psychologist, University of Washington: I am hard-pressed to identify any sci-fi works that make use of evolutionary psychology directly, or even that fit neatly into its scientific world-view. Some possibilities include those books that have made use of the concept of selective breeding for particular behavioral inclinations: Dune comes to mind, and of course, before that, Brave New World. Although evo-psych presumes genetic influence on behavior, it definitely doesn't imply anything like the genetic determinism found in either of these. In that sense, these books are more like a mis-use of evo-psych, likely to confirm the worst fears of readers who don't understand the science itself. Another case of this would be Margaret Atwood's The Handmaid's Tale, which derived from the author's mis-reading of what was then called sociobiology - specifically, her assumption that a science that examined male-female differences (among other things) was also prescribing and exaggerating these differences.
Ed Webb

What killed Caprica? - 0 views

  • Caprica may have gone too far, tried to cover too much. It broke one of the cardinal rules of mainstream science fiction, which is that if you have a strange alternate universe you'd better populate it with recognizable, ordinary characters. But I like the kind of thought-experiment audaciousness that says, Hell yes we are going to give you complicated characters who defy stereotypes, and put them in a world whose rules you'll have to think hard to understand. It's too late to bring Caprica back. But I hope that this show is the first part of a new wave of science fiction on TV. Like The Sarah Connor Chronicles, Dollhouse, and Fringe, Caprica tackles singularity-level technology as a political and economic phenomenon - not as an escapist fantasy. And that's why it was a show worth watching, even when it stumbled.
Ed Webb

News: Cheating and the Generational Divide - Inside Higher Ed - 0 views

  • such attitudes among students can develop from the notion that all of education can be distilled into performance on a test -- which today's college students have absorbed from years of schooling under No Child Left Behind -- and not that education is a process in which one grapples with difficult material.
    • Ed Webb
       
      Exactly so. If the focus of education is moved away from testing regurgitated factoids and toward building genuine skills of critical analysis and effective communication, the apparent 'gap' in understanding of what cheating is will surely go away.
  •  
    I'd love to know what you Dystopians think about this.
  •  
    Institutional education puts far too much pressure on students to do well in tests. This, I believe, forces students to cheat, because if you do not perform well in this one form of evaluation you are clearly not educated well enough, not trying hard enough, or just plain dumb. I doubt there are many instances outside of institutional education where you would need to memorize a number of facts for a short period of time with your very future at stake. To me the only cheating is plagiarism. If you're taking a standardized test and you don't know the answer to question 60 but the student next to you does, how would it hurt anyone to share that answer? You're learning the answer to question 60. It's the same knowledge you'll learn when you get the test back and realize the answer to 60 was A, not B. Again, though, when will this scenario occur outside of schooling?
Ed Webb

Ted Turner urges global one-child policy to save planet - The Globe and Mail - 1 views

  • Climate change and population control can make for a politically explosive mix, as media mogul Ted Turner demonstrated Sunday when he urged world leaders to institute a global one-child policy to save the Earth’s environment. Mr. Turner spoke at a luncheon where economist Brian O’Neill from the U.S.’s National Center for Atmospheric Research unveiled his study on the impact of demographic trends on future greenhouse gas emission, a little-discussed subject given its political sensitivity.
  • fertility rights could be sold so that poor people could profit from their decision not to reproduce
  • Mary Robinson warned that radical prescriptions for population control would backfire, ensuring that the subject will remain off the agenda of international climate talks.“If we do it the wrong way, we can divide the world,” Ms. Robinson said. “A lot of people in the climate world could communicate this very badly.”
  • ...1 more annotation...
  • Mr. O’Neill said he was not advocating any particular policy, although he noted that global surveys suggest there is a vast, unmet demand for family planning, and just making contraception universally available on a voluntary basis would drive down the birth rate
Ed Webb

Writing in College - 1. Some crucial differences between high school and college writing - 0 views

  • some students ask why they should be required to convince anyone of anything. "After all," they say, "we are all entitled to our opinions, so all we should have to do is express them clearly. Here's my opinion. Take it or leave it." This point of view both misunderstands the nature of argument and ignores its greatest value
    • Ed Webb
       
      Key!
  • In an Age of Information, what most professionals do is research, think, and make arguments. (And part of the value of doing your own thinking and writing is that it makes you much better at evaluating the thinking and writing of others.)
    • Ed Webb
       
      Both parts of this - both sentences - are important.
  • Words such as "show how" and "explain" and "illustrate" do not ask you to summarize a reading. They ask you to show how the reading is put together, how it works
  • ...3 more annotations...
  • A third kind of assignment is simultaneously least restrictive and most intimidating. These assignments leave it up to you to decide not only what you will claim but what you will write about and even what kind of analysis you will do: "Analyze the role of a character in The Odyssey." That is the kind of assignment that causes many students anxiety because they must motivate their research almost entirely on their own. To meet this kind of assignment, the best advice we can give is to read with your mind open to things that puzzle you, that make you wish you understood something better. Now that advice may seem almost counterproductive; you may even think that being puzzled or not understanding something testifies to your intellectual failure. Yet almost everything we do in a university starts with someone being puzzled about something, someone with a vague--or specific--dissatisfaction caused by not knowing something that seems important or by wanting to understand something better. The best place to begin thinking about any assignment is with what you don't understand but wish you did.
  • If after all this analysis of the assignment you are still uncertain about what is expected of you, ask your instructor. If your class has a Writing Intern, ask that person. If for some reason you can't ask either, locate the Academic Tutor in your residence hall and ask that person. Do this as soon as possible.
  • you will only rarely be able to state good points like these before you write your first draft. Much more often, you discover good points at the end of the process of drafting. Writing is a way of thinking through a problem, of discovering what you want to say. So do not feel that you should begin to write only when you have a fully articulated point in mind. Instead, write to discover and to refine it
Ed Webb

Artificial Intelligence and the Future of Humans | Pew Research Center - 0 views

  • experts predicted networked artificial intelligence will amplify human effectiveness but also threaten human autonomy, agency and capabilities
  • most experts, regardless of whether they are optimistic or not, expressed concerns about the long-term impact of these new tools on the essential elements of being human. All respondents in this non-scientific canvassing were asked to elaborate on why they felt AI would leave people better off or not. Many shared deep worries, and many also suggested pathways toward solutions. The main themes they sounded about threats and remedies are outlined in the accompanying table.
  • CONCERNS (from the report's summary table):
    Human agency: Individuals are experiencing a loss of control over their lives. Decision-making on key aspects of digital life is automatically ceded to code-driven, "black box" tools. People lack input and do not learn the context about how the tools work. They sacrifice independence, privacy and power over choice; they have no control over these processes. This effect will deepen as automated systems become more prevalent and complex.
    Data abuse: Data use and surveillance in complex systems is designed for profit or for exercising power. Most AI tools are and will be in the hands of companies striving for profits or governments striving for power. Values and ethics are often not baked into the digital systems making people's decisions for them. These systems are globally networked and not easy to regulate or rein in.
    Job loss: The AI takeover of jobs will widen economic divides, leading to social upheaval. The efficiencies and other economic advantages of code-based machine intelligence will continue to disrupt all aspects of human work. While some expect new jobs will emerge, others worry about massive job losses, widening economic divides and social upheavals, including populist uprisings.
    Dependence lock-in: Reduction of individuals’ cognitive, social and survival skills. Many see AI as augmenting human capacities but some predict the opposite - that people's deepening dependence on machine-driven networks will erode their abilities to think for themselves, take action independent of automated systems and interact effectively with others.
    Mayhem: Autonomous weapons, cybercrime and weaponized information. Some predict further erosion of traditional sociopolitical structures and the possibility of great loss of lives due to accelerated growth of autonomous military applications and the use of weaponized information, lies and propaganda to dangerously destabilize human groups. Some also fear cybercriminals' reach into economic systems.
  • ...18 more annotations...
  • AI and ML [machine learning] can also be used to increasingly concentrate wealth and power, leaving many people behind, and to create even more horrifying weapons
  • “In 2030, the greatest set of questions will involve how perceptions of AI and their application will influence the trajectory of civil rights in the future. Questions about privacy, speech, the right of assembly and technological construction of personhood will all re-emerge in this new AI context, throwing into question our deepest-held beliefs about equality and opportunity for all. Who will benefit and who will be disadvantaged in this new world depends on how broadly we analyze these questions today, for the future.”
  • SUGGESTED SOLUTIONS (from the report's summary table):
    Global good is No. 1: Improve human collaboration across borders and stakeholder groups. Digital cooperation to serve humanity's best interests is the top priority. Ways must be found for people around the world to come to common understandings and agreements - to join forces to facilitate the innovation of widely accepted approaches aimed at tackling wicked problems and maintaining control over complex human-digital networks.
    Values-based system: Develop policies to assure AI will be directed at ‘humanness’ and common good. Adopt a 'moonshot mentality' to build inclusive, decentralized intelligent digital networks 'imbued with empathy' that help humans aggressively ensure that technology meets social and ethical responsibilities. Some new level of regulatory and certification process will be necessary.
    Prioritize people: Alter economic and political systems to better help humans ‘race with the robots’. Reorganize economic and political systems toward the goal of expanding humans' capacities and capabilities in order to heighten human/AI collaboration and staunch trends that would compromise human relevance in the face of programmed intelligence.
  • “I strongly believe the answer depends on whether we can shift our economic systems toward prioritizing radical human improvement and staunching the trend toward human irrelevance in the face of AI. I don’t mean just jobs; I mean true, existential irrelevance, which is the end result of not prioritizing human well-being and cognition.”
  • We humans care deeply about how others see us – and the others whose approval we seek will increasingly be artificial. By then, the difference between humans and bots will have blurred considerably. Via screen and projection, the voice, appearance and behaviors of bots will be indistinguishable from those of humans, and even physical robots, though obviously non-human, will be so convincingly sincere that our impression of them as thinking, feeling beings, on par with or superior to ourselves, will be unshaken. Adding to the ambiguity, our own communication will be heavily augmented: Programs will compose many of our messages and our online/AR appearance will [be] computationally crafted. (Raw, unaided human speech and demeanor will seem embarrassingly clunky, slow and unsophisticated.) Aided by their access to vast troves of data about each of us, bots will far surpass humans in their ability to attract and persuade us. Able to mimic emotion expertly, they’ll never be overcome by feelings: If they blurt something out in anger, it will be because that behavior was calculated to be the most efficacious way of advancing whatever goals they had ‘in mind.’ But what are those goals?
  • AI will drive a vast range of efficiency optimizations but also enable hidden discrimination and arbitrary penalization of individuals in areas like insurance, job seeking and performance assessment
  • The record to date is that convenience overwhelms privacy
  • As AI matures, we will need a responsive workforce, capable of adapting to new processes, systems and tools every few years. The need for these fields will arise faster than our labor departments, schools and universities are acknowledging
  • AI will eventually cause a large number of people to be permanently out of work
  • Newer generations of citizens will become more and more dependent on networked AI structures and processes
  • there will exist sharper divisions between digital ‘haves’ and ‘have-nots,’ as well as among technologically dependent digital infrastructures. Finally, there is the question of the new ‘commanding heights’ of the digital network infrastructure’s ownership and control
  • As a species we are aggressive, competitive and lazy. We are also empathic, community minded and (sometimes) self-sacrificing. We have many other attributes. These will all be amplified
  • Given historical precedent, one would have to assume it will be our worst qualities that are augmented
  • Our capacity to modify our behaviour, subject to empathy and an associated ethical framework, will be reduced by the disassociation between our agency and the act of killing
  • We cannot expect our AI systems to be ethical on our behalf – they won’t be, as they will be designed to kill efficiently, not thoughtfully
  • the Orwellian nightmare realised
  • “AI will continue to concentrate power and wealth in the hands of a few big monopolies based on the U.S. and China. Most people – and parts of the world – will be worse off.”
  • The remainder of this report is divided into three sections that draw from hundreds of additional respondents’ hopeful and critical observations: 1) concerns about human-AI evolution, 2) suggested solutions to address AI’s impact, and 3) expectations of what life will be like in 2030, including respondents’ positive outlooks on the quality of life and the future of work, health care and education
Ed Webb

WIRED - 0 views

  • Over the past two years, RealNetworks has developed a facial recognition tool that it hopes will help schools more accurately monitor who gets past their front doors. Today, the company launched a website where school administrators can download the tool, called SAFR, for free and integrate it with their own camera systems
  • how to balance privacy and security in a world that is starting to feel like a scene out of Minority Report
  • facial recognition technology often misidentifies black people and women at higher rates than white men
  • ...7 more annotations...
  • "The use of facial recognition in schools creates an unprecedented level of surveillance and scrutiny," says John Cusick, a fellow at the Legal Defense Fund. "It can exacerbate racial disparities in terms of how schools are enforcing disciplinary codes and monitoring their students."
  • The school would ask adults, not kids, to register their faces with the SAFR system. After they registered, they’d be able to enter the school by smiling at a camera at the front gate. (Smiling tells the software that it’s looking at a live person and not, for instance, a photograph). If the system recognizes the person, the gates automatically unlock. [A hypothetical sketch of this gate logic follows these annotations.]
  • The software can predict a person's age and gender, enabling schools to turn off access for people below a certain age. But Glaser notes that if other schools want to register students going forward, they can
  • There are no guidelines about how long the facial data gets stored, how it’s used, or whether people need to opt in to be tracked.
  • Schools could, for instance, use facial recognition technology to monitor who's associating with whom and discipline students differently as a result. "It could criminalize friendships," says Cusick of the Legal Defense Fund.
  • SAFR boasts a 99.8 percent overall accuracy rating, based on a test, created by the University of Massachusetts, that vets facial recognition systems. But Glaser says the company hasn’t tested whether the tool is as good at recognizing black and brown faces as it is at recognizing white ones. RealNetworks deliberately opted not to have the software proactively predict ethnicity, the way it predicts age and gender, for fear of it being used for racial profiling. Still, testing the tool's accuracy among different demographics is key. Research has shown that many top facial recognition tools are particularly bad at recognizing black women
  • "It's tempting to say there's a technological solution, that we're going to find the dangerous people, and we're going to stop them," she says. "But I do think a large part of that is grasping at straws."
Ed Webb

Could fully automated luxury communism ever work? - 0 views

  • Having achieved a seamless, pervasive commodification of online sociality, Big Tech companies have turned their attention to infrastructure. Attempts by Google, Amazon and Facebook to achieve market leadership, in everything from AI to space exploration, risk a future defined by the battle for corporate monopoly.
  • The technologies are coming. They’re already here in certain instances. It’s the politics that surrounds them. We have alternatives: we can have public ownership of data in the citizen’s interest or it could be used as it is in China where you have a synthesis of corporate and state power
  • the two alternatives that big data allows is an all-consuming surveillance state where you have a deep synthesis of capitalism with authoritarian control, or a reinvigorated welfare state where more and more things are available to everyone for free or very low cost
  • ...4 more annotations...
  • we can’t begin those discussions until we say, as a society, we want to at least try subordinating these potentials to the democratic project, rather than allow capitalism to do what it wants
  • I say in FALC that this isn’t a blueprint for utopia. All I’m saying is that there is a possibility for the end of scarcity, the end of work, a coming together of leisure and labour, physical and mental work. What do we want to do with it? It’s perfectly possible something different could emerge where you have this aggressive form of social value.
  • I think the thing that’s been beaten out of everyone since 2010 is one of the prevailing tenets of neoliberalism: work hard, you can be whatever you want to be, that you’ll get a job, be well paid and enjoy yourself.  In 2010, that disappeared overnight, the rules of the game changed. For the status quo to continue to administer itself,  it had to change common sense. You see this with Jordan Peterson; he’s saying you have to know your place and that’s what will make you happy. To me that’s the only future for conservative thought, how else do you mediate the inequality and unhappiness?
  • I don’t think we can rapidly decarbonise our economies without working people understanding that it’s in their self-interest. A green economy means better quality of life. It means more work. Luxury populism feeds not only into the green transition, but the rollout of Universal Basic Services and even further.
Ed Webb

Can Sci-Fi Writers Prepare Us for an Uncertain Future? | WIRED - 0 views

  • a growing contingent of sci-fi writers being hired by think tanks, politicians, and corporations to imagine—and predict—the future
  • Harvard Business Review made the corporate case for reading sci-fi years ago, and mega consulting firm PricewaterhouseCoopers published a guide on how to use sci-fi to “explore innovation.” The New Yorker has touted “better business through sci-fi.” As writer Brian Merchant put it, “Welcome to the Sci-Fi industrial complex.”
  • The use of sci-fi has bled into government and public policy spheres. The New America Foundation recently held an all-day event discussing “What Sci-Fi Futures Can (and Can't) Teach Us About AI Policy.” And Nesta, an organization that generates speculative fiction, has committed $24 million to grow “new models of public services” in collaboration with the UK government
  • ...8 more annotations...
  • Some argue that there is power in narrative stories that can’t be found elsewhere. Others assert that in our quest for imagination and prediction, we’re deluding ourselves into thinking that we can predict what’s coming
  • The World Future Society and the Association of Professional Futurists represent a small but growing group of professionals, many of whom have decades of experience thinking about long-term strategy and “scenario planning”—a method used by organizations to try and prepare for possible futures.
  • true Futurism is often pretty unsexy. It involves sifting through a lot of data and research and models and spreadsheets. Nobody is going to write a profile of your company or your government project based on a dry series of models outlining carefully caveated possibilities. On the other hand, worldbuilding—the process of imagining a universe in which your fictional stories can exist—is fun. People want stories, and science fiction writers can provide them.
  • Are those who write epic space operas (no matter how good those space operas might be) really the right people to ask about the future of work or water policy or human rights?
  • critics worry that writers are so good at spinning stories that they might even convince you those stories are true. In actuality, history shows us that predictions are nearly impossible to make and that humans are catastrophically bad at guessing what the future will hold
  • it's important to distinguish between prediction and impact. Did Star Trek anticipate the cell phone, or were the inventors of the cell phone inspired by Star Trek? Listicles of “all the things sci-fi has predicted” are largely exercises in cherry picking—they never include the things that sci-fi got wrong
  • In this line of work, specifics matter. It’s one thing to write a book about a refugee crisis, but quite another to predict exactly how the Syrian refugee crisis unfolded
  • It’s tempting to turn to storytelling in times of crisis—and it’s hard to argue that we’re not in a time of crisis now. Within dystopian pieces of fiction there are heroes and survivors, characters we can identify with who come out the other side and make it out OK. Companies and governments and individuals all want to believe that they will be among those lucky few, the heroes of the story. And science fiction writers can deliver that, for a fee.
Ed Webb

Stephen's Web ~ Over 50% of Google searches result in no clicks, data shows ~ Stephen D... - 0 views

  • should Google be able to insert its services between yourself and your search target? What would we say if, say, 'Google University' offered to book courses with selected partners to people searching 'learn philosophy'? What about job search firms?