Dystopias: group items tagged "words"

Ed Webb

How to Mark a Book - 0 views

  • A book is more like the score of a piece of music than it is like a painting. No great musician confuses a symphony with the printed sheets of music. Arturo Toscanini reveres Brahms, but Toscanini's score of the C minor Symphony is so thoroughly marked up that no one but the maestro himself can read it. The reason why a great conductor makes notations on his musical scores -- marks them up again and again each time he returns to study them -- is the reason why you should mark your books.
    • Ed Webb
       
      This is an excellent analogy.
  • the physical act of writing, with your own hand, brings words and sentences more sharply before your mind and preserves them better in your memory. To set down your reaction to important words and sentences you have read, and the questions they have raised in your mind, is to preserve those reactions and sharpen those questions.
    • Ed Webb
       
      The effect of new technologies here is still imperfectly understood. But there is some evidence that typing notes is less efficacious than handwriting them, in terms of inscribing information to memory and developing thought.
  • that is exactly what reading a book should be: a conversation between you and the author. Presumably he knows more about the subject than you do; naturally, you'll have the proper humility as you approach him. But don't let anybody tell you that a reader is supposed to be solely on the receiving end. Understanding is a two-way operation; learning doesn't consist in being an empty receptacle. The learner has to question himself and question the teacher. He even has to argue with the teacher, once he understands what the teacher is saying. And marking a book is literally an expression of differences, or agreements of opinion, with the author
  • ...4 more annotations...
  • Underlining (or highlighting): of major points, of important or forceful statements. Vertical lines at the margin: to emphasize a statement already underlined. Star, asterisk, or other doo-dad at the margin: to be used sparingly, to emphasize the ten or twenty most important statements in the book. (You may want to fold the bottom corner of each page on which you use such marks. It won't hurt the sturdy paper on which most modern books are printed, and you will be able to take the book off the shelf at any time and, by opening it at the folded-corner page, refresh your recollection of the book.) Numbers in the margin: to indicate the sequence of points the author makes in developing a single argument. Numbers of other pages in the margin: to indicate where else in the book the author made points relevant to the point marked; to tie up the ideas in a book, which, though they may be separated by many pages, belong together. Circling or highlighting of key words or phrases. Writing in the margin, or at the top or bottom of the page, for the sake of: recording questions (and perhaps answers) which a passage raised in your mind; reducing a complicated discussion to a simple statement; recording the sequence of major points right through the book. I use the end-papers at the back of the book to make a personal index of the author's points in the order of their appearance.
    • Ed Webb
       
      This is a good schema. You can develop your own that accomplishes the same. The key is to have a schema and apply it consistently.
  • you may say that this business of marking books is going to slow up your reading. It probably will. That's one of the reasons for doing it
  • Some things should be read quickly and effortlessly and some should be read slowly and even laboriously.
  • Why is marking up a book indispensable to reading? First, it keeps you awake. (And I don't mean merely conscious; I mean awake.) In the second place, reading, if it is active, is thinking, and thinking tends to express itself in words, spoken or written. The marked book is usually the thought-through book. Finally, writing helps you remember the thoughts you had, or the thoughts the author expressed.
Ed Webb

The Web Means the End of Forgetting - NYTimes.com - 1 views

  • for a great many people, the permanent memory bank of the Web increasingly means there are no second chances — no opportunities to escape a scarlet letter in your digital past. Now the worst thing you’ve done is often the first thing everyone knows about you.
  • a collective identity crisis. For most of human history, the idea of reinventing yourself or freely shaping your identity — of presenting different selves in different contexts (at home, at work, at play) — was hard to fathom, because people’s identities were fixed by their roles in a rigid social hierarchy. With little geographic or social mobility, you were defined not as an individual but by your village, your class, your job or your guild. But that started to change in the late Middle Ages and the Renaissance, with a growing individualism that came to redefine human identity. As people perceived themselves increasingly as individuals, their status became a function not of inherited categories but of their own efforts and achievements. This new conception of malleable and fluid identity found its fullest and purest expression in the American ideal of the self-made man, a term popularized by Henry Clay in 1832.
  • the dawning of the Internet age promised to resurrect the ideal of what the psychiatrist Robert Jay Lifton has called the “protean self.” If you couldn’t flee to Texas, you could always seek out a new chat room and create a new screen name. For some technology enthusiasts, the Web was supposed to be the second flowering of the open frontier, and the ability to segment our identities with an endless supply of pseudonyms, avatars and categories of friendship was supposed to let people present different sides of their personalities in different contexts. What seemed within our grasp was a power that only Proteus possessed: namely, perfect control over our shifting identities. But the hope that we could carefully control how others view us in different contexts has proved to be another myth. As social-networking sites expanded, it was no longer quite so easy to have segmented identities: now that so many people use a single platform to post constant status updates and photos about their private and public activities, the idea of a home self, a work self, a family self and a high-school-friends self has become increasingly untenable. In fact, the attempt to maintain different selves often arouses suspicion.
  • ...20 more annotations...
  • All around the world, political leaders, scholars and citizens are searching for responses to the challenge of preserving control of our identities in a digital world that never forgets. Are the most promising solutions going to be technological? Legislative? Judicial? Ethical? A result of shifting social norms and cultural expectations? Or some mix of the above?
  • These approaches share the common goal of reconstructing a form of control over our identities: the ability to reinvent ourselves, to escape our pasts and to improve the selves that we present to the world.
  • many technological theorists assumed that self-governing communities could ensure, through the self-correcting wisdom of the crowd, that all participants enjoyed the online identities they deserved. Wikipedia is one embodiment of the faith that the wisdom of the crowd can correct most mistakes — that a Wikipedia entry for a small-town mayor, for example, will reflect the reputation he deserves. And if the crowd fails — perhaps by turning into a digital mob — Wikipedia offers other forms of redress
  • In practice, however, self-governing communities like Wikipedia — or algorithmically self-correcting systems like Google — often leave people feeling misrepresented and burned. Those who think that their online reputations have been unfairly tarnished by an isolated incident or two now have a practical option: consulting a firm like ReputationDefender, which promises to clean up your online image. ReputationDefender was founded by Michael Fertik, a Harvard Law School graduate who was troubled by the idea of young people being forever tainted online by their youthful indiscretions. “I was seeing articles about the ‘Lord of the Flies’ behavior that all of us engage in at that age,” he told me, “and it felt un-American that when the conduct was online, it could have permanent effects on the speaker and the victim. The right to new beginnings and the right to self-definition have always been among the most beautiful American ideals.”
  • In the Web 3.0 world, Fertik predicts, people will be rated, assessed and scored based not on their creditworthiness but on their trustworthiness as good parents, good dates, good employees, good baby sitters or good insurance risks.
  • “Our customers include parents whose kids have talked about them on the Internet — ‘Mom didn’t get the raise’; ‘Dad got fired’; ‘Mom and Dad are fighting a lot, and I’m worried they’ll get a divorce.’ ”
  • as facial-recognition technology becomes more widespread and sophisticated, it will almost certainly challenge our expectation of anonymity in public
  • Ohm says he worries that employers would be able to use social-network-aggregator services to identify people’s book and movie preferences and even Internet-search terms, and then fire or refuse to hire them on that basis. A handful of states — including New York, California, Colorado and North Dakota — broadly prohibit employers from discriminating against employees for legal off-duty conduct like smoking. Ohm suggests that these laws could be extended to prevent certain categories of employers from refusing to hire people based on Facebook pictures, status updates and other legal but embarrassing personal information. (In practice, these laws might be hard to enforce, since employers might not disclose the real reason for their hiring decisions, so employers, like credit-reporting agents, might also be required by law to disclose to job candidates the negative information in their digital files.)
  • research group’s preliminary results suggest that if rumors spread about something good you did 10 years ago, like winning a prize, they will be discounted; but if rumors spread about something bad that you did 10 years ago, like driving drunk, that information has staying power
  • many people aren’t worried about false information posted by others — they’re worried about true information they’ve posted about themselves when it is taken out of context or given undue weight. And defamation law doesn’t apply to true information or statements of opinion. Some legal scholars want to expand the ability to sue over true but embarrassing violations of privacy — although it appears to be a quixotic goal.
  • Researchers at the University of Washington, for example, are developing a technology called Vanish that makes electronic data “self-destruct” after a specified period of time. Instead of relying on Google, Facebook or Hotmail to delete the data that is stored “in the cloud” — in other words, on their distributed servers — Vanish encrypts the data and then “shatters” the encryption key. To read the data, your computer has to put the pieces of the key back together, but they “erode” or “rust” as time passes, and after a certain point the document can no longer be read.
  • Plenty of anecdotal evidence suggests that young people, having been burned by Facebook (and frustrated by its privacy policy, which at more than 5,000 words is longer than the U.S. Constitution), are savvier than older users about cleaning up their tagged photos and being careful about what they post.
  • norms are already developing to recreate off-the-record spaces in public, with no photos, Twitter posts or blogging allowed. Milk and Honey, an exclusive bar on Manhattan’s Lower East Side, requires potential members to sign an agreement promising not to blog about the bar’s goings on or to post photos on social-networking sites, and other bars and nightclubs are adopting similar policies. I’ve been at dinners recently where someone has requested, in all seriousness, “Please don’t tweet this” — a custom that is likely to spread.
  • There’s already a sharp rise in lawsuits known as Twittergation — that is, suits to force Web sites to remove slanderous or false posts.
  • strategies of “soft paternalism” that might nudge people to hesitate before posting, say, drunken photos from Cancún. “We could easily think about a system, when you are uploading certain photos, that immediately detects how sensitive the photo will be.”
  • It’s sobering, now that we live in a world misleadingly called a “global village,” to think about privacy in actual, small villages long ago. In the villages described in the Babylonian Talmud, for example, any kind of gossip or tale-bearing about other people — oral or written, true or false, friendly or mean — was considered a terrible sin because small communities have long memories and every word spoken about other people was thought to ascend to the heavenly cloud. (The digital cloud has made this metaphor literal.) But the Talmudic villages were, in fact, far more humane and forgiving than our brutal global village, where much of the content on the Internet would meet the Talmudic definition of gossip: although the Talmudic sages believed that God reads our thoughts and records them in the book of life, they also believed that God erases the book for those who atone for their sins by asking forgiveness of those they have wronged. In the Talmud, people have an obligation not to remind others of their past misdeeds, on the assumption they may have atoned and grown spiritually from their mistakes. “If a man was a repentant [sinner],” the Talmud says, “one must not say to him, ‘Remember your former deeds.’ ” Unlike God, however, the digital cloud rarely wipes our slates clean, and the keepers of the cloud today are sometimes less forgiving than their all-powerful divine predecessor.
  • On the Internet, it turns out, we’re not entitled to demand any particular respect at all, and if others don’t have the empathy necessary to forgive our missteps, or the attention spans necessary to judge us in context, there’s nothing we can do about it.
  • Gosling is optimistic about the implications of his study for the possibility of digital forgiveness. He acknowledged that social technologies are forcing us to merge identities that used to be separate — we can no longer have segmented selves like “a home or family self, a friend self, a leisure self, a work self.” But although he told Facebook, “I have to find a way to reconcile my professor self with my having-a-few-drinks self,” he also suggested that as all of us have to merge our public and private identities, photos showing us having a few drinks on Facebook will no longer seem so scandalous. “You see your accountant going out on weekends and attending clown conventions, that no longer makes you think that he’s not a good accountant. We’re coming to terms and reconciling with that merging of identities.”
  • a humane society values privacy, because it allows people to cultivate different aspects of their personalities in different contexts; and at the moment, the enforced merging of identities that used to be separate is leaving many casualties in its wake.
  • we need to learn new forms of empathy, new ways of defining ourselves without reference to what others say about us and new ways of forgiving one another for the digital trails that will follow us forever
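The Vanish research described above (data encrypted once, with an encryption key whose pieces "erode" over time) can be illustrated with a toy sketch. This is not Vanish's actual protocol, which scatters Shamir-style key shares across a peer-to-peer network; it is a minimal illustration, assuming the third-party Python cryptography package, of why losing even a single key share makes the ciphertext unreadable.

```python
# Toy sketch of "self-destructing" data: encrypt once, split the key into XOR
# shares, and imagine each share stored somewhere with a limited lifetime.
# NOT the real Vanish protocol; assumes the third-party `cryptography` package.
import secrets
from cryptography.fernet import Fernet

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list:
    """Split `key` into n shares; all n are required to reconstruct it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def combine(shares: list) -> bytes:
    out = shares[0]
    for s in shares[1:]:
        out = xor_bytes(out, s)
    return out

key = Fernet.generate_key()                 # symmetric key for the message
token = Fernet(key).encrypt(b"meet me at noon")
shares = split_key(key, 5)                  # e.g. five shares, each with a TTL

assert combine(shares) == key               # all shares present: still readable
print(Fernet(combine(shares)).decrypt(token))

eroded = shares[1:]                         # one share expires or is lost...
assert combine(eroded) != key               # ...and the key cannot be rebuilt
```

In the system the article describes, the shares sit on distributed nodes that naturally churn, so no one has to delete anything for the document to become unreadable after the chosen period.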
Ed Webb

Charlie Brooker | Google Instant is trying to kill me | Comment is free | The Guardian - 0 views

  • I'm starting to feel like an unwitting test subject in a global experiment conducted by Google, in which it attempts to discover how much raw information it can inject directly into my hippocampus before I crumple to the floor and start fitting uncontrollably.
  • It's the internet on fast-forward, and it's aggressive – like trying to order from a waiter who keeps finishing your sentences while ramming spoonfuls of what he thinks you want directly into your mouth, so you can't even enjoy your blancmange without chewing a gobful of black pudding first.
  • Google may have released him from the physical misery of pressing enter, but it's destroyed his sense of perspective in the process.
  • ...2 more annotations...
  • My attention span was never great, but modern technology has halved it, and halved it again, and again and again, down to an atomic level, and now there's nothing discernible left. Back in that room, bombarded by alerts and emails, repeatedly tapping search terms into Google Instant for no good reason, playing mindless pinball with words and images, tumbling down countless little attention-vortexes, plunging into one split-second coma after another, I began to feel I was neither in control nor 100% physically present. I wasn't using the computer. The computer was using me – to keep its keys warm.
  • I'm rationing my internet usage and training my mind muscles for the future. Because I can see where it's heading: a service called Google Assault that doesn't even bother to guess what you want, and simply hurls random words and sounds and images at you until you dribble all the fluid out of your body. And I know it'll kill me, unless I train my brain to withstand and ignore it. For me, the war against the machines has started in earnest.
Ed Webb

FastFiction - Intravenous Electric Fire - 7 views

  •  
    Can you evoke a world in 200 words?
Ed Webb

Surveillant Society - 0 views

  • The assumption that one is not being recorded in any real way, a standard in civilization for more or less all of history, is being overturned.
  • CEOs have become slaves to the PR department in a bizarre inversion of internal corporate checks and balances
  • “Your word against mine” can be a serious and drawn-out dispute, subject to all kinds of subjective judgments, loyalties, rights, and arguments; “Your word against my high-definition video” gives citizens and the vulnerable a bit more leverage.
  • ...8 more annotations...
  • The issue of ownership is being muddied by the same process that has upended media industries – the transition of recordable data from physical to virtual property, infinitely copyable but still subject to many of the necessities of more traditionally-held items. Who owns what, who is legally bound to act in which way, which licenses supersede others? A team of lawyers and scholars might spend months putting together a cohesive argument for any number of possibilities. What chance does an end user have to figure out whether or not they have the right to print, distribute, delete, and so on?
  • if you show everything, you’re likely to show something you should have hidden, and if you hide everything, everyone will assume you did so for a reason
  • Hoaxes, fakes, set-ups, staged scenarios, creative editing, post-production, photoshopping, and every other tool of the trade, all show something other than the raw, original product. I’m not familiar with forensic digital media evaluation tools in use today, but I get the feeling that if they’re not inadequate now, they will be so in a few years.
  • our responsibilities as a society to use these new tools judiciously and responsibly
  • increasingly, the answers to these questions are tending towards the “record first” mentality
  • The logical next step, after assuming one is being recorded at all times when in public (potentially true) is ensuring one is being recorded at all times when in public. Theoretically, you won’t act any differently, since you’re already operating under that assumption.
  • how long before it’s considered negligent to have not recorded an accident or criminal act?
  • You have no privacy in public, haven’t had any for a long time, and what little you have you tend to give away. But the sword is double-edged; shouldn’t we benefit from that as well as suffer? A surveillance society is watched. A surveillant society is watching.
Ed Webb

American white people really hate being called "white people" - Vox - 0 views

  • as research on “priming” shows, simply discussing race at all kicks up those effects among the racially dominant group. Or to put it more bluntly, in the US context: White people really don’t like being called white people. They don’t like being reminded that they are white people, part of a group with discernible boundaries, shared interests, and shared responsibilities
  • one of the benefits of being in the dominant demographic and cultural group is that you are allowed to simply be a person, a blank slate upon which you can write your own individual story. You have no baggage but what you choose
  • The power and privilege that come along with that — being the base model, a person with no asterisk — are invisible to many white men. Simply calling them “white people,” much less questioning the behavior or beliefs of white people, drags that power and privilege into the open
  • ...3 more annotations...
  • No one else gets to pretend their politics are free of identity. White people do. But simply saying the words “white people” is a direct attack on that illusion. It identifies, i.e., creates (or rather, exposes) an identity, a group with shared characteristics and interests. It raises questions (and doubts) about the group’s standing and power relative to other groups. It illuminates all that hidden baggage. Lots of white people really hate that
  • it’s difficult to think of a US setting in which the words “white people” are received neutrally. The term is always charged somehow, freighted with meaning and potential conflict, vaguely subversive
  • As many have pointed out and this political era has made painfully clear, to a dominant demographic, the loss of privilege feels like persecution. Being just one group among many feels like losing. After all, what good is being white in the US, especially among poor whites, if some third-generation Ugandan immigrant has just as much control over their fate as they have over hers? If a poll asks whether they’re any good for her, rather than the other way around?
Ed Webb

GCHQ revelations: mastery of the internet will mean mastery of everyone | Henry Porter ... - 0 views

  • We are fond of saying that the younger generation doesn't know the meaning of the word privacy, but what you give away voluntarily and what the state takes are as different as charity and tax. Privacy is the defining quality of a free people. Snowden's compelling leaks show us that mastery of the internet will ineluctably mean mastery over the individual.
Ed Webb

Would You Protect Your Computer's Feelings? Clifford Nass Says Yes. - ProfHacker - The ... - 0 views

  • The Man Who Lied to His Laptop condenses for a popular audience an argument that Nass has been making for at least 15 years: humans do not differentiate between computers and people in their social interactions.
  • At first blush, this sounds absurd. Everyone knows that it's "just a computer," and of course computers don't have feelings. And yet. Nass has a slew of amusing stories—and, crucially, studies based on those stories—indicating that, no matter what "everyone knows," people act as if the computer secretly cares. For example: In one study, users reviewed a software package, either on the same computer they'd used it on, or on a different computer. Consistently, participants gave the software better ratings when they reviewed it on the same computer—as if they didn't want the computer to feel bad. What's more, Nass notes, "every one of the participants insisted that she or he would never bother being polite to a computer" (7).
  • Nass found that users given completely random praise by a computer program liked it more than the same program without praise, even though they knew in advance the praise was meaningless. In fact, they liked it as much as the same program, if they were told the praise was accurate. (In other words, flattery was as well received as praise, and both were preferred to no positive comments.) Again, when questioned about the results, users angrily denied any difference at all in their reactions.
  •  
    How do you interact with the computing devices in your life?
Ed Webb

Stephen Downes: A World to Change - 0 views

  • we need, first, to take charge of our own learning, and next, help others take charge of their own learning. We need to move beyond the idea that an education is something that is provided for us, and toward the idea that an education is something that we create for ourselves. It is time, in other words, that we change our attitude toward learning and the educational system in general. That is not to advocate throwing learners off the bus to fend for themselves. It is hard to be self-reliant, to take charge of one's own learning, and people shouldn't have to do it alone. It is instead to articulate a way we as a society approach education and learning, beginning with an attitude, through the development of supports and a system, through to the techniques and technologies that support that.
  •  
    For those interested in blogging further about education, more food for thought
Ed Webb

Our Digitally Undying Memories - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • as Viktor Mayer-Schönberger argues convincingly in his book Delete: The Virtue of Forgetting in the Digital Age (Princeton University Press, 2009), the costs of such powerful collective memory are often higher than we assume.
  • "Total recall" renders context, time, and distance irrelevant. Something that happened 40 years ago—whether youthful or scholarly indiscretion—still matters and can come back to harm us as if it had happened yesterday.
  • an important "third wave" of work about the digital environment. In the late 1990s and early 2000s, we saw books like Nicholas Negroponte's Being Digital (Knopf, 1995) and Howard Rheingold's The Virtual Community: Homesteading on the Electronic Frontier (Addison-Wesley, 1993) and Smart Mobs: The Next Social Revolution (Perseus, 2002), which idealistically described the transformative powers of digital networks. Then we saw shallow blowback, exemplified by Susan Jacoby's The Age of American Unreason (Pantheon, 2008).
  • ...14 more annotations...
  • For most of human history, forgetting was the default and remembering the challenge.
  • Chants, songs, monasteries, books, libraries, and even universities were established primarily to overcome our propensity to forget over time. The physical and economic limitations of all of those technologies and institutions served us well. Each acted not just as memory aids but also as filters or editors. They helped us remember much by helping us discard even more.
    • Ed Webb
       
      Excellent point, well made.
  • Our use of the proliferating data and rudimentary filters in our lives renders us incapable of judging, discriminating, or engaging in deductive reasoning. And inductive reasoning, which one could argue is entering a golden age with the rise of huge databases and the processing power needed to detect patterns and anomalies, is beyond the reach of lay users of the grand collective database called the Internet.
  • Even 10 years ago, we did not consider that words written for a tiny audience could reach beyond, perhaps to someone unforgiving, uninitiated in a community, or just plain unkind.
  • Remembering to forget, as Elvis argued, is also essential to getting over heartbreak. And, as Jorge Luis Borges wrote in his 1942 (yep, I Googled it to find the date) story "Funes el memorioso," it is just as important to the act of thinking. Funes, the young man in the story afflicted with an inability to forget anything, can't make sense of it. He can't think abstractly. He can't judge facts by relative weight or seriousness. He is lost in the details. Painfully, Funes cannot rest.
  • Just because we have the vessels, we fill them.
  • the default habits of our species: to record, retain, and release as much information as possible
  • Perhaps we just have to learn to manage wisely how we digest, discuss, and publicly assess the huge archive we are building. We must engender cultural habits that ensure perspective, calm deliberation, and wisdom. That's hard work.
  • we choose the nature of technologies. They don't choose us. We just happen to choose unwisely with some frequency
  • surveillance as the chief function of electronic government
  • critical information studies
  • Siva Vaidhyanathan is an associate professor of media studies and law at the University of Virginia. His next book, The Googlization of Everything, is forthcoming from the University of California Press.
  • Nietzsche's _On the Use and Disadvantage of History for Life_
  • Google compresses, if not eliminates, temporal context. This is likely only to exacerbate the existing problem in politics of taking one's statements out of context. A politician whose views on a subject have evolved quite logically over decades in light of changing knowledge and/or circumstances is held up in attack ads as a flip-flopper because consecutive Google entries have him/her saying two opposite things about the same subject -- and never mind that between the two statements, the Berlin Wall may have fallen or the economy crashed harder than at any other time since 1929.
Ed Webb

Goodbye petabytes, hello zettabytes | Technology | The Guardian - 0 views

  • Every man, woman and child on the planet using micro-blogging site Twitter for a century. For many people that may sound like a vision of hell, but for watchers of the tremendous growth of digital communications it is a neat way of presenting the sheer scale of the so-called digital universe.
  • the growing desire of corporations and governments to know and store ever more data about everyone
  • experts estimate that all human language used since the dawn of time would take up about 5,000 petabytes if stored in digital form, which is less than 1% of the digital content created since someone first switched on a computer.
  • ...6 more annotations...
  • A zettabyte, incidentally, is roughly half a million times the entire collections of all the academic libraries in the United States.
  • Mobile phones have dramatically widened the range of people who can create, store and share digital information. "China now has more visible devices out on the streets being used by individuals than the US does," said McDonald. "We are seeing the democratisation and commoditisation of the use and creation of information."
  • About 70% of the digital universe is generated by individuals, but its storage is then predominantly the job of corporations. From emails and blogs to mobile phone calls, it is corporations that are storing information on behalf of consumers.
  • actions in the offline world that individuals carry out which result in digital content being created by organisations – from cashpoint transactions which a bank must record to walking along the pavement, which is likely to result in CCTV footage
  • "unstructured"
  • "You talk to a kid these days and they have no idea what a kilobyte is. The speed things progress, we are going to need many words beyond zettabyte."
Ed Webb

Programmed for Love: The Unsettling Future of Robotics - The Chronicle Review - The Chr... - 0 views

  • Her prediction: Companies will soon sell robots designed to baby-sit children, replace workers in nursing homes, and serve as companions for people with disabilities. All of which to Turkle is demeaning, "transgressive," and damaging to our collective sense of humanity. It's not that she's against robots as helpers—building cars, vacuuming floors, and helping to bathe the sick are one thing. She's concerned about robots that want to be buddies, implicitly promising an emotional connection they can never deliver.
  • We are already cyborgs, reliant on digital devices in ways that many of us could not have imagined just a few years ago
  • "We are hard-wired that if something meets extremely primitive standards, either eye contact or recognition or very primitive mutual signaling, to accept it as an Other because as animals that's how we're hard-wired—to recognize other creatures out there."
  • ...4 more annotations...
  • "Can a broken robot break a child?" they asked. "We would not consider the ethics of having children play with a damaged copy of Microsoft Word or a torn Raggedy Ann doll. But sociable robots provoke enough emotion to make this ethical question feel very real."
  • "The concept of robots as baby sitters is, intellectually, one that ought to appeal to parents more than the idea of having a teenager or similarly inexperienced baby sitter responsible for the safety of their infants," he writes. "Their smoke-detection capabilities will be better than ours, and they will never be distracted for the brief moment it can take an infant to do itself some terrible damage or be snatched by a deranged stranger."
  • "What if we get used to relationships that are made to measure?" Turkle asks. "Is that teaching us that relationships can be just the way we want them?" After all, if a robotic partner were to become annoying, we could just switch it off.
  • We've reached a moment, she says, when we should make "corrections"—to develop social norms to help offset the feeling that we must check for messages even when that means ignoring the people around us. "Today's young people have a special vulnerability: Although always connected, they feel deprived of attention," she writes. "Some, as children, were pushed on swings while their parents spoke on cellphones. Now these same parents do their e-mail at the dinner table." One 17-year-old boy even told her that at least a robot would remember everything he said, contrary to his father, who often tapped at a BlackBerry during conversations.
Ed Webb

Writing in College - 1. Some crucial differences between high school and college writing - 0 views

  • some students ask why they should be required to convince anyone of anything. "After all," they say, "we are all entitled to our opinions, so all we should have to do is express them clearly. Here's my opinion. Take it or leave it." This point of view both misunderstands the nature of argument and ignores its greatest value
    • Ed Webb
       
      Key!
  • In an Age of Information, what most professionals do is research, think, and make arguments. (And part of the value of doing your own thinking and writing is that it makes you much better at evaluating the thinking and writing of others.)
    • Ed Webb
       
      Both parts of this - both sentences - are important.
  • Words such as "show how" and "explain" and "illustrate" do not ask you to summarize a reading. They ask you to show how the reading is put together, how it works
  • ...3 more annotations...
  • A third kind of assignment is simultaneously least restrictive and most intimidating. These assignments leave it up to you to decide not only what you will claim but what you will write about and even what kind of analysis you will do: "Analyze the role of a character in The Odyssey." That is the kind of assignment that causes many students anxiety because they must motivate their research almost entirely on their own. To meet this kind of assignment, the best advice we can give is to read with your mind open to things that puzzle you, that make you wish you understood something better. Now that advice may seem almost counterproductive; you may even think that being puzzled or not understanding something testifies to your intellectual failure. Yet almost everything we do in a university starts with someone being puzzled about something, someone with a vague--or specific--dissatisfaction caused by not knowing something that seems important or by wanting to understand something better. The best place to begin thinking about any assignment is with what you don't understand but wish you did.
  • If after all this analysis of the assignment you are still uncertain about what is expected of you, ask your instructor. If your class has a Writing Intern, ask that person. If for some reason you can't ask either, locate the Academic Tutor in your residence hall and ask that person. Do this as soon as possible.
  • you will only rarely be able to state good points like these before you write your first draft. Much more often, you discover good points at the end of the process of drafting. Writing is a way of thinking through a problem, of discovering what you want to say. So do not feel that you should begin to write only when you have a fully articulated point in mind. Instead, write to discover and to refine it
Ed Webb

"We will need writers who can remember freedom": Ursula K Le Guin at the National Book ... - 0 views

  • I think hard times are coming when we will be wanting the voices of writers who can see alternatives to how we live now and can see through our fear-stricken society and its obsessive technologies to other ways of being, and even imagine some real grounds for hope. We will need writers who can remember freedom. Poets, visionaries—the realists of a larger reality.
  • We live in capitalism. Its power seems inescapable. So did the divine right of kings. Any human power can be resisted and changed by human beings. Resistance and change often begin in art, and very often in our art—the art of words.
Ed Webb

We, The Technocrats - blprnt - Medium - 2 views

  • Silicon Valley’s go-to linguistic dodge: the collective we
  • “What kind of a world do we want to live in?”
  • Big tech’s collective we is its ‘all lives matter’, a way to soft-pedal concerns about privacy while refusing to speak directly to dangerous inequalities.
  • ...7 more annotations...
  • One two-letter word cannot possibly hold all of the varied experiences of data, specifically those of the people who are at the most immediate risk: visible minorities, LGBTQ+ people, indigenous communities, the elderly, the disabled, displaced migrants, the incarcerated
  • At least twenty-six states allow the FBI to perform facial recognition searches against their databases of images from drivers licenses and state IDs, despite the fact that the FBI’s own reports have indicated that facial recognition is less accurate for black people. Black people, already at a higher risk of arrest and incarceration than other Americans, feel these data systems in a much different way than I do
  • last week, the Department of Justice passed a brief to the Supreme Court arguing that sex discrimination protections do not extend to transgender people. If this ruling were to be supported, it would immediately put trans women and men at more risk than others from the surveillant data technologies that are becoming more and more common in the workplace. Trans people will be put in distinct danger — a reality that is lost when they are folded neatly into a communal we
  • I looked at the list of speakers for the conference in Brussels to get an idea of the particular we of Cook’s audience, which included Mark Zuckerberg, Google’s CEO Sundar Pichai and the King of Spain. Of the presenters, 57% were men and 83% were white. Only 4 of the 132 people on stage were black.
  • another we that Tim Cook necessarily speaks on the behalf of: privileged men in tech. This we includes Mark and Sundar; it includes 60% of Silicon Valley and 91% of its equity. It is this we who have reaped the most benefit from Big Data and carried the least risk, all while occupying the most time on stage
  • Here’s a more urgent question for us, one that doesn’t ask what we want but instead what they need: How can this new data world be made safer for the people who are facing real risks, right now?
  • “The act of listening has greater ethical potential than speaking” — Julietta Singh
Ed Webb

Can Economists and Humanists Ever Be Friends? | The New Yorker - 0 views

  • There is something thrilling about the intellectual audacity of thinking that you can explain ninety per cent of behavior in a society with one mental tool.
  • education, which they believe is a form of domestication
  • there is no moral dimension to this economic analysis: utility is a fundamentally amoral concept
  • ...11 more annotations...
  • intellectual overextension is often found in economics, as Gary Saul Morson and Morton Schapiro explain in their wonderful book “Cents and Sensibility: What Economics Can Learn from the Humanities” (Princeton). Morson and Schapiro—one a literary scholar and the other an economist—draw on the distinction between hedgehogs and foxes made by Isaiah Berlin in a famous essay from the nineteen-fifties, invoking an ancient Greek fragment: “The fox knows many things, but the hedgehog one big thing.” Economists tend to be hedgehogs, forever on the search for a single, unifying explanation of complex phenomena. They love to look at a huge, complicated mass of human behavior and reduce it to an equation: the supply-and-demand curves; the Phillips curve, which links unemployment and inflation; or mb=mc, which links a marginal benefit to a marginal cost—meaning that the fourth slice of pizza is worth less to you than the first. These are powerful tools, which can be taken too far. Morson and Schapiro cite the example of Gary Becker, the Nobel laureate in economics in 1992. Becker is a hero to many in the field, but, for all the originality of his thinking, to outsiders he can stand for intellectual overconfidence. He thought that “the economic approach is a comprehensive one that is applicable to all human behavior.” Not some, not most—all
  • Becker analyzed, in his own words, “fertility, education, the uses of time, crime, marriage, social interactions, and other ‘sociological,’ ‘legal,’ and ‘political problems,’ ” before concluding that economics explained everything
  • The issue here is one of overreach: taking an argument that has worthwhile applications and extending it further than it usefully goes. Our motives are often not what they seem: true. This explains everything: not true. After all, it’s not as if the idea that we send signals about ourselves were news; you could argue that there is an entire social science, sociology, dedicated to the subject. Classic practitioners of that discipline study the signals we send and show how they are interpreted by those around us, as in Erving Goffman’s “The Presentation of Self in Everyday Life,” or how we construct an entire identity, both internally and externally, from the things we choose to be seen liking—the argument of Pierre Bourdieu’s masterpiece “Distinction.” These are rich and complicated texts, which show how rich and complicated human difference can be. The focus on signalling and unconscious motives in “The Elephant in the Brain,” however, goes the other way: it reduces complex, diverse behavior to simple rules.
  • “A traditional cost-benefit analysis could easily have led to the discontinuation of a project widely viewed as being among the most successful health interventions in African history.”
  • Another part of me, though, is done with it, with the imperialist ambitions of economics and its tendency to explain away differences, to ignore culture, to exalt reductionism. I want to believe Morson and Schapiro and Desai when they posit that the gap between economics and the humanities can be bridged, but my experience in both writing fiction and studying economics leads me to think that they’re wrong. The hedgehog doesn’t want to learn from the fox. The realist novel is a solemn enemy of equations. The project of reducing behavior to laws and the project of attending to human beings in all their complexity and specifics are diametrically opposed. Perhaps I’m only talking about myself, and this is merely an autobiographical reflection, rather than a general truth, but I think that if I committed any further to economics I would have to give up writing fiction. I told an economist I know about this, and he laughed. He said, “Sounds like you’re maximizing your utility.” 
  • finance is full of “attribution errors,” in which people view their successes as deserved and their failures as bad luck. Desai notes that in business, law, or pedagogy we can gauge success only after months or years; in finance, you can be graded hour by hour, day by day, and by plainly quantifiable measures. What’s more, he says, “the ‘discipline of the market’ shrouds all of finance in a meritocratic haze.” And so people who succeed in finance “are susceptible to developing massively outsized egos and appetites.”
  • one of the things I liked about economics, finance, and the language of money was their lack of hypocrisy. Modern life is full of cant, of people saying things they don’t quite believe. The money guys, in private, don’t go in for cant. They’re more like Mafia bosses. I have to admit that part of me resonates to that coldness.
  • Economics, Morson and Schapiro say, has three systematic biases: it ignores the role of culture, it ignores the fact that “to understand people one must tell stories about them,” and it constantly touches on ethical questions beyond its ken. Culture, stories, and ethics are things that can’t be reduced to equations, and economics accordingly has difficulty with them
  • According to Hanson and Simler, these unschooled workers “won’t show up for work reliably on time, or they have problematic superstitions, or they prefer to get job instructions via indirect hints instead of direct orders, or they won’t accept tasks and roles that conflict with their culturally assigned relative status with co-workers, or they won’t accept being told to do tasks differently than they had done them before.”
  • The idea that Maya Angelou’s career amounts to nothing more than a writer shaking her tail feathers to attract the attention of a dominant male is not just misleading; it’s actively embarrassing.
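As a concrete toy version of the mb = mc rule quoted above: keep consuming while the marginal benefit of the next unit still covers its marginal cost. The slice valuations below are invented for the example, not taken from the article.

```python
# Toy version of the mb = mc rule: keep consuming while the marginal benefit
# of the next unit still covers its (constant) marginal cost. Numbers invented.
marginal_benefit = [10, 7, 4, 2]     # value to you of pizza slices 1..4
marginal_cost = 3                    # every slice costs the same

slices = 0
for mb in marginal_benefit:
    if mb < marginal_cost:           # the next slice is no longer worth it
        break
    slices += 1

print(slices)   # 3 -- the fourth slice is "worth less to you than the first"
```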
Ed Webb

Angry Optimism in a Drowned World: A Conversation with Kim Stanley Robinson | CCCB LAB - 0 views

  • The idea would be that not only do you have a multigenerational project of building a new world, but obviously the human civilization occupying it would also be new. And culturally and politically, it would be an achievement that would have no reason to stick with old forms from the history of Earth. It’s a multigenerational project, somewhat like building these cathedrals in Europe where no generation expects to end the job. By the time the job is near completion, the civilization operating it will be different to the one that began the project.
  • what the Mars scenario gave me – and gives all of humanity – is the idea that the physical substrate of the planet itself is also a part of the project, and it's something that we are strong enough to influence. Not create, not completely control, not completely engineer because it's too big and we don't have that much ability to manipulate the large systems involved, nor the amount of power involved. But we do have enough to mess things up and we do have enough to finesse the system. This, I think, was a precursor to the idea of the Anthropocene. The Anthropocene is precisely the geological moment when humanity becomes a geological force, and it's a science-fiction exercise to say that 50 million years from now, humanity's descendants, or some other alien civilization, will be able to look at Earth and say: "This is when humanity began to impact things as much as volcanos or earthquakes." So it's a sci-fi story being told in contemporary culture as one way to define what we are doing now. So, that was what my Mars project was doing, and now we are in the Anthropocene as a mental space.
  • if humanity’s impact on the Earth is mostly negative in ecological terms, if you mark humanity’s impact as being so significant that we have produced a new geological age, then we have to think differently in our attitudes towards what we are doing with our biophysical substrate. And one of the things I think the Anthropocene brings up is that the Earth is our body, and we can finesse it, we can impact it, we can make ourselves sick.
  • ...32 more annotations...
  • The truth is that we are actually already at that moment of climate change and crisis. The political project that my novel discusses really ought to be enacted now, not 120 years from now. In the real world, what we’ve got is a necessity for our economic system to take damage to the ecosystem into account, and pay for that damage.
  • I worry that we’ve already swallowed the idea of the Anthropocene and stopped considering the importance of it; the profound shock that it should cause has already been diffused into just one more idea game that we play.
  • there is no question that, at times in the past, the Earth has been an ice ball with none of its water melted, and also a jungle planet with all of its water melted, and no ice on the planet whatsoever. And this is just from the natural extremes of planetary orbiting, and feedback loops of the atmosphere that we have naturally. But then what humanity is doing – and the reason you need the term “Anthropocene” – is pushing us into zones that the planet maybe has been in the past, but never with this extraordinary speed. Things that would have taken three, four, five million years in the past, or even longer, a 50-million-year process, are being done in fifty years, a million times faster
  • The market doesn’t have a brain, a conscience, a morality or a sense of history. The market only has one rule and it’s a bad rule, a rule that would only work in a world where there was an infinity of raw materials, what the eco-Marxists are calling the “four cheaps”: cheap food, cheap power, cheap labour, cheap raw material
  • this isn’t the way capitalism works, as currently configured; this isn’t profitable. The market doesn’t like it. By the market I mean – what I think everybody means, but doesn’t admit – capital, accumulated capital, and where it wants to put itself next. And where it wants to put itself next is at the highest rate of return, so that if it’s a 7% return to invest in vacation homes on the coast of Spain, and it’s only a 6% rate of return to build a new clean power plant out in the empty highlands of Spain, the available capital of this planet will send that money and investment and human work into vacation homes on the coast of Spain rather than the power plants
  • If Spain were to do a certain amount for its country, but was sacrificing relative to international capital or to other countries, then it would be losing the battle for competitive advantage in the capitalist system
  • Nobody can afford to volunteer to be extra virtuous in a system where the only rule is quarterly profit and shareholder value. Where the market rules, all of us are fighting for the crumbs to get the best investment for the market.
  • the market is like a blind giant driving us off a cliff into destruction
  • we need postcapitalism
  • I look to the next generation, to people who are coming into their own intellectual power and into political and economic power, to be the most productive citizens, at the start of their careers, to change the whole story. But, sometimes it just strikes me as astonishing, how early on we are in our comprehension of this system
  • design is a strange amalgam, like a science-fictional cyborg between art and engineering, planning, building, and doing things in the real world
  • you can't have permanent growth.
  • The Anthropocene is that moment in which capitalist expansion can no longer expand, and you get a crush of the biophysical system – that’s climate change – and then you get a crush of the political economy because, if you’ve got a system that demands permanent growth, capital accumulation and profit and you can’t do it anymore, you get a crisis that can’t be solved by the next expansion
  • If the Anthropocene is a crisis, an end of the road for capitalism, well, what is post-capitalism? This I find painfully under-discussed and under-theorized. As a Sci-Fi writer, an English major, a storyteller – not a theorist nor a political economist – looking for help, looking for theories and speculations as to what will come next and how it will work, and finding a near emptiness.
  • here is the aporia, as they call it: the non-seeing that is in human culture today. This is another aspect of the Anthropocene
  • Economics is the quantitative and systematic analysis of capitalism itself. Economics doesn’t do speculative or projective economics; perhaps it should, I mean, I would love it if it did, but it doesn’t
  • If the rules of that global economy were good, there could not be bad actors because if the G20,  95% of the economy, were all abiding by good rules, there would be nowhere for greedy actors to escape to, to enact their greed.
  • You can see the shapes of a solution. This is very important for anybody that wants to have hope or everybody that is realizing that there will be humans after us, the generations to come. It’s strange because they are absent; they are going to be here, they are going to be our descendants and they are even going to have our DNA in them. They will be versions of us but because they are not here now, it’s very easy to dismiss their concerns.
  • capitalist economics discounts their concerns, in the technical term of what is called in economics “the discount rate”. So, a high discount rate in your economic calculations of value — like amortized payments or borrowing from the future – says: “The future isn’t important to us, they will take care of themselves” and a low discount rate says: “We are going to account for the future, we think the future matters, the people yet to come matter.” That choice of a discount rate is entirely an ethical and political decision; it’s not a technical or scientific decision except for, perhaps, the technical suggestion that if you want your children to survive you’d better choose a lower discount rate. But that “if” is kind of a moral, an imaginative statement, and less practical in the long-term view.
  • I have been talking about these issues for about fifteen years and, ten years ago, to suggest that the Paris Agreement would be signed, people would say: “but that will never happen!” As a utopian science-fiction writer, it was a beautiful moment.
  • As a Science-Fiction writer, what is in your view the responsibility that the arts, literature and literary fiction can have in helping to articulate possible futures? It seems that imagining other forms of living is key to producing them, to make them actionable.
  • The sciences are maybe the dominant cultural voice in finding out what’s going on in the world and how things work, and the technicalities about how and why things work. But how that feels, the emotional impact in it, which is so crucial to the human mind and human life in general, these are what the arts provide
  • The way that we create energy and the way that we move around on this planet both have to be de-carbonized. That has to be, if not profitable, affordable
  • This is what bothers me in economics; its blind adherence to the capitalist moment even when it is so destructive. Enormous amounts of intellectual energy are going into the pseudo-quantitative legal analysis of an already-existing system that’s destructive. Well, this is not good enough anymore because it’s wrecking the biophysical infrastructure
  • What would that new way of living be? The economists are not going to think of it. The artists are often not specific enough in their technical and physical detail, so they can become fantasy novelists rather than science-fiction novelists; there is too much a possibility in the arts, and I know very well myself, of having a fantasy response, a wish fulfilment. But when you’re doing architecture you think: “Well, I need ten million dollars, I need this land, I need to entrain the lives of five hundred people for ten years of their careers in order to make something that then will be good for the future generations to use.”
  • After the 2008 crash of the world economy, the neoliberal regime began to look a bit more fragile and brutal, less massive and immovable. I see things very differently, the world reacting very differently since the 2008 crash to how it did before it. There was this blind faith that capitalism worked, and also even if it didn’t work it wasn’t changeable, it was too massive to change. Now what I am pointing out comes from the radical economists coming out of political economy, anthropology and leftist politics saying that international finance is simply overleveraged and therefore is extremely fragile and open to being taken down. Because it depends on everybody paying their bills and fulfilling their contracts.
  • Human extinction, this is bullshit. Humans will scratch around and find some refuge. You could imagine horrible disasters and reductions of human population but extinction is not the issue for humans, it’s for everybody else. All of our horizontal brothers and sisters, the other big mammals, are in terrible trouble from our behaviour
  • I actually am offended at this focus on the human; "Oh, we'll be in trouble": big deal. We deserve to be in trouble, we created the trouble. The extinctions of the other big mammals: the tigers, rhinoceroses, all big mammals that aren't domestic creatures of our own built in factories, are in terrible trouble. So, the human effort ought to be towards avoiding extinctions of other creatures. Never waste a worry for humanity itself, which, no matter what, won't become extinct. Ten centuries from now, humanity will be doing something and that something is likely to be more sustainable and interesting than what we are doing now. The question for us is: "How do you get there?" But ten centuries from now, there might not be any tigers.
  • There’s an Antonio Gramsci idea you have used to explain your position: “pessimism of the intellect, optimism of the will.” Your optimism is a moral and political position, it’s not just hoping for the best. Why do you think we need to defend optimism?
  • Use the optimism as a club, to beat the crap out of people who are saying that we are doomed, who are saying let’s give up now. And this “let’s give up now” can be very elaborated academically. You can say: “Well, I’m just into adaptation rather than mitigation, there’s nothing we can do about climate change, all you can do is adapt to it.” In other words, stick with capitalism, stick with the market, and don’t get freaked out. Just adapt and get your tenure because it is usually academics who say it, and they’re not usually in design or architecture, they aren’t really doing things. They’re usually in philosophy or in theory. They come out of my departments, they’re telling a particular story and I don’t like that story. My story is: the optimism that I’m trying to express is that there won’t be an apocalypse, there will be a disaster. But after the disaster comes the next world on.
  • there’s a sort of apocalyptic end-of-the-world “ism” that says that I don’t have to change my behaviour, I don’t have to try because it’s already doomed
  • Maybe optimism is a kind of moral imperative, you have to stay optimistic because otherwise you’re just a wanker that’s taken off into your own private Idaho of “Oh well, things are bad.” It’s so easy to be cynical; it’s so easy to be pessimistic
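A numeric sketch of the discount-rate point Robinson makes above. The damage figure, the horizon and both rates are illustrative assumptions, not numbers from the interview; the formula is the standard present-value calculation.

```python
# Present value of the same future harm under a high vs. a low discount rate.
# The $1 trillion figure, the horizon and both rates are assumptions for
# illustration; the formula is the standard PV = FV / (1 + r)**t.
future_damage = 1_000_000_000_000    # $1 trillion of damage, 100 years out
years = 100

for rate in (0.07, 0.014):           # a "high" and a "low" discount rate
    pv = future_damage / (1 + rate) ** years
    print(f"discount rate {rate:.1%}: present value ~${pv / 1e9:,.0f} billion")

# At 7% the trillion shrinks to roughly $1 billion today; at 1.4% it is still
# around $249 billion -- which is why the choice of rate is an ethical choice.
```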