
Ed Webb

How to Mark a Book

  • A book is more like the score of a piece of music than it is like a painting. No great musician confuses a symphony with the printed sheets of music. Arturo Toscanini reveres Brahms, but Toscanini's score of the G minor Symphony is so thoroughly marked up that no one but the maestro himself can read it. The reason why a great conductor makes notations on his musical scores -- marks them up again and again each time he returns to study them--is the reason why you should mark your books.
    • Ed Webb
       
      This is an excellent analogy.
  • the physical act of writing, with your own hand, brings words and sentences more sharply before your mind and preserves them better in your memory. To set down your reaction to important words and sentences you have read, and the questions they have raised in your mind, is to preserve those reactions and sharpen those questions.
    • Ed Webb
       
      The effect of new technologies here is still imperfectly understood. But there is some evidence that typing notes is less efficacious than handwriting them, in terms of inscribing information to memory and developing thought.
  • that is exactly what reading a book should be: a conversation between you and the author. Presumably he knows more about the subject than you do; naturally, you'll have the proper humility as you approach him. But don't let anybody tell you that a reader is supposed to be solely on the receiving end. Understanding is a two-way operation; learning doesn't consist in being an empty receptacle. The learner has to question himself and question the teacher. He even has to argue with the teacher, once he understands what the teacher is saying. And marking a book is literally an expression of differences, or agreements of opinion, with the author
  • Underlining (or highlighting): of major points, of important or forceful statements. Vertical lines at the margin: to emphasize a statement already underlined. Star, asterisk, or other doo-dad at the margin: to be used sparingly, to emphasize the ten or twenty most important statements in the book. (You may want to fold the bottom corner of each page on which you use such marks. It won't hurt the sturdy paper on which most modern books are printed, and you will be able to take the book off the shelf at any time and, by opening it at the folded-corner page, refresh your recollection of the book.) Numbers in the margin: to indicate the sequence of points the author makes in developing a single argument. Numbers of other pages in the margin: to indicate where else in the book the author made points relevant to the point marked; to tie up the ideas in a book, which, though they may be separated by many pages, belong together. Circling or highlighting of key words or phrases. Writing in the margin, or at the top or bottom of the page, for the sake of: recording questions (and perhaps answers) which a passage raised in your mind; reducing a complicated discussion to a simple statement; recording the sequence of major points right through the book. I use the end-papers at the back of the book to make a personal index of the author's points in the order of their appearance.
    • Ed Webb
       
      This is a good schema. You can develop your own that accomplishes the same. The key is to have a schema and apply it consistently.
  • you may say that this business of marking books is going to slow up your reading. It probably will. That's one of the reasons for doing it
  • Some things should be read quickly and effortlessly and some should be read slowly and even laboriously.
  • Why is marking up a book indispensable to reading? First, it keeps you awake. (And I don't mean merely conscious; I mean awake.) In the second place, reading, if it is active, is thinking, and thinking tends to express itself in words, spoken or written. The marked book is usually the thought-through book. Finally, writing helps you remember the thoughts you had, or the thoughts the author expressed.
Ed Webb

The Imaginative Reality of Ursula K. Le Guin | VQR Online

  • The founders of this anarchist society made up a new language because they realized you couldn’t have a new society and an old language. They based the new language on the old one but changed it enormously. It’s simply an illustration of what Orwell was saying in his great essay about how writing English clearly is a political matter.
    • Ed Webb
       
      Le Guin, of course, admires "Politics and the English Language." Real-world examples of people changing languages to change society include the invention of modern Turkish and modern Hebrew.
  • There are advantages and disadvantages to living a very long time, as I have. One of the advantages is that you can’t help having a long view. You’ve seen it come and seen it go. Something that’s being announced as the absolute only way to write, you recognize as a fashion, a fad, trendy—the way to write right now if you want to sell right now to a right now editor. But there’s also the long run to consider. Nothing’s deader than last year’s trend. 
  • Obviously, the present tense has certain uses that it’s wonderfully suited for. But recently it has been adopted blindly, as the only way to tell a story—often by young writers who haven’t read very much. Well, it’s a good way to tell some stories, not a good way to tell others. It’s inherently limiting. I call it “flashlight focus.” You see a spot ahead of you and it is dark all around it. That’s great for high suspense, high drama, cut-to-the-chase writing. But if you want to tell a big, long story, like the books of Elena Ferrante, or Jane Smiley’s The Last Hundred Years trilogy, which moves year by year from 1920 to 2020—the present tense would cripple those books. To assume that the present tense is literally “now” and the past tense literally remote in time is extremely naïve. 
  • Henry James did the limited third person really well, showing us the way to do it. He milked that cow successfully. And it’s a great cow, it still gives lots of milk. But if you read only contemporary stuff, always third-person limited, you don’t realize that point of view in a story is very important and can be very movable. It’s here where I suggest that people read books like Woolf’s To the Lighthouse to see what she does by moving from mind to mind. Or Tolstoy’s War and Peace for goodness’ sake. Wow. The way he slides from one point of view to another without you knowing that you’ve changed point of view—he does it so gracefully. You know where you are, whose eyes you are seeing through, but you don’t have the sense of being jerked from place to place. That’s mastery of a craft.
  • Any of us who grew up reading eighteenth- or nineteenth-century fiction are perfectly at home with what is called “omniscience.” I myself call it “authorial” point of view because the term “omniscience,” the idea of an author being omniscient, is so often used in a judgmental way, as if it were a bad thing. But the author, after all, is the author of all these characters, the maker, the inventor of them. In fact all the characters are the author if you come right down to the honest truth of it. So the author has the perfect right to know what they’re thinking. If the author doesn’t tell you what they are thinking … why? This is worth thinking about. Often it’s simply to spin out suspense by not telling you what the author knows. Well, that’s legitimate. This is art. But I’m trying to get people to think about their choices here, because there are so many beautiful choices that are going unused. In a way, first person and limited third are the easiest ones, the least interesting.
  • to preach that story is conflict, always to ask, “Where’s the conflict in your story?”—this needs some thinking about. If you say that story is about conflict, that plot must be based on conflict, you’re limiting your view of the world severely. And in a sense making a political statement: that life is conflict, so in stories conflict is all that really matters. This is simply untrue. To see life as a battle is a narrow, social-Darwinist view, and a very masculine one. Conflict, of course, is part of life, I’m not saying you should try to keep it out of your stories, just that it’s not their only lifeblood. Stories are about a lot of different things
  • The first decade of her career, beginning in the sixties, included some of her most well-known works of fiction: A Wizard of Earthsea, The Left Hand of Darkness, The Dispossessed, and The Lathe of Heaven. Each of these works imagined not just worlds, but homes, homes that became real for her readers, homes where protagonists were women, people of color, gender fluid, anticapitalist—imaginary homes that did not simply spin out our worst dystopic fears for the future like so many of the apocalyptic novels of today, but also modeled other ways of being, other ways to create home.
  • “Children know perfectly well that unicorns aren’t real,” Le Guin once said. “But they also know that books about unicorns, if they are good books, are true books.”
  • “Fake rules” and “alternative facts” are used in our time not to increase moral understanding and social possibility but to increase power for those who already have it. A war on language has unhinged words from their meaning, language from its capacity as truth-teller. But perhaps, counterintuitively, it is in the realm of the imagination, the fictive, where we can best re-ground ourselves in the real and the true.
  • you can’t find your own voice if you aren’t listening for it. The sound of your writing is an essential part of what it’s doing. Our teaching of writing tends to ignore it, except maybe in poetry. And so we get prose that goes clunk, clunk, clunk. And we don’t know what’s wrong with it
  • You emphasize the importance of understanding grammar and grammar terminology but also the importance of interrogating its rules. You point out that it is a strange phenomenon that grammar is the tool of our trade and yet so many writers steer away from an engagement with it.
    In my generation and for a while after—I was born in 1929—we were taught grammar right from the start. It was quietly drilled into us. We knew the names of the parts of speech, we had a working acquaintance with how English works, which they don’t get in most schools anymore. There is so much less reading in schools, and very little teaching of grammar. For a writer this is kind of like being thrown into a carpenter’s shop without ever having learned the names of the tools or handled them consciously. What do you do with a Phillips screwdriver? What is a Phillips screwdriver? We’re not equipping people to write; we’re just saying, “You too can write!” or “Anybody can write, just sit down and do it!” But to make anything, you’ve got to have the tools to make it.
  • In your book on writing, Steering the Craft, you say that morality and language are linked, but that morality and correctness are not the same thing. Yet we often confuse them in the realm of grammar.
    The “grammar bullies”—you read them in places like the New York Times—tell you what is correct: You must never use “hopefully.” “Hopefully, we will be going there on Tuesday.” That is incorrect and wrong and you are basically an ignorant pig if you say it. This is judgmentalism. The game that is being played there is a game of social class. It has nothing to do with the morality of writing and speaking and thinking clearly, which Orwell, for instance, talked about so well. It’s just affirming that I am from a higher class than you are. The trouble is that people who aren’t taught grammar very well in school fall for these statements from these pundits, delivered with vast authority from above. I’m fighting that. A very interesting case in point is using “they” as a singular. This offends the grammar bullies endlessly; it is wrong, wrong, wrong! Well, it was right until the eighteenth century, when they invented the rule that “he” includes “she.” It didn’t exist in English before then; Shakespeare used “they” instead of “he or she”—we all do, we always have done, in speaking, in colloquial English. It took the women’s movement to bring it back to English literature. And it is important. Because it’s a crossroads between correctness bullying and the moral use of language. If “he” includes “she” but “she” doesn’t include “he,” a big statement is being made, with huge social and moral implications. But we don’t have to use “he” that way—we’ve got “they.” Why not use it?
Ed Webb

Piper at the Gates of Hell: An Interview with Cyberpunk Legend John Shirley | Motherboard

    • Ed Webb
       
      City Come A-Walkin' is one of the most punk of the cyberpunk novels and short stories I have ever read, and I have read quite a few...
  • I'll press your buttons here by positing that if "we" (humankind) are too dumb to self-regulate our own childbirth output, too dim to recognize that we are polluting ourselves and neighbors out of sustainable existence, we are, in fact, a ridiculous parasite on this Earth and that the planet on which we live will simply slough us off—as it well should—and will bounce back without evidence of us even being here, come two or three thousand years. Your thoughts (in as much detail as you wish)?
    I would recommend reading my "the next 50 years" piece here. Basically I think that climate change, which in this case genuinely is caused mostly by humanity, is just one part of the environmental problem. Overfishing, toxification of the seas, pesticide use, weedkillers, prescription drugs in water, fracking, continued air pollution, toxicity in food, destruction of animal habitat, attrition on bee colonies—all this is converging. And we'll be facing the consequences for several hundred years.
  • I believe humanity will survive, and it won't be surviving like Road Warrior or the Morlocks from The Time Machine, but I think we'll have some cruelly ugly social consequences. We'll have famines the like of which we've never seen before, along with higher risk of wars—I do predict a third world war in the second half of this century but I don't think it will be a nuclear war—and I think we'll suffer so hugely we'll be forced to have a change in consciousness to adapt.
  • We may end up having to "terraform" the Earth itself, to some extent.
Ed Webb

Writing in College - 1. Some crucial differences between high school and college writing

  • some students ask why they should be required to convince anyone of anything. "After all," they say, "we are all entitled to our opinions, so all we should have to do is express them clearly. Here's my opinion. Take it or leave it." This point of view both misunderstands the nature of argument and ignores its greatest value
    • Ed Webb
       
      Key!
  • In an Age of Information, what most professionals do is research, think, and make arguments. (And part of the value of doing your own thinking and writing is that it makes you much better at evaluating the thinking and writing of others.)
    • Ed Webb
       
      Both parts of this - both sentences - are important.
  • Words such as "show how" and "explain" and "illustrate" do not ask you to summarize a reading. They ask you to show how the reading is put together, how it works
  • A third kind of assignment is simultaneously least restrictive and most intimidating. These assignments leave it up to you to decide not only what you will claim but what you will write about and even what kind of analysis you will do: "Analyze the role of a character in The Odyssey." That is the kind of assignment that causes many students anxiety because they must motivate their research almost entirely on their own. To meet this kind of assignment, the best advice we can give is to read with your mind open to things that puzzle you, that make you wish you understood something better. Now that advice may seem almost counterproductive; you may even think that being puzzled or not understanding something testifies to your intellectual failure. Yet almost everything we do in a university starts with someone being puzzled about something, someone with a vague--or specific--dissatisfaction caused by not knowing something that seems important or by wanting to understand something better. The best place to begin thinking about any assignment is with what you don't understand but wish you did.
  • If after all this analysis of the assignment you are still uncertain about what is expected of you, ask your instructor. If your class has a Writing Intern, ask that person. If for some reason you can't ask either, locate the Academic Tutor in your residence hall and ask that person. Do this as soon as possible.
  • you will only rarely be able to state good points like these before you write your first draft. Much more often, you discover good points at the end of the process of drafting. Writing is a way of thinking through a problem, of discovering what you want to say. So do not feel that you should begin to write only when you have a fully articulated point in mind. Instead, write to discover and to refine it
Ed Webb

A woman first wrote the prescient ideas Huxley and Orwell made famous - Quartzy

  • In 1919, a British writer named Rose Macaulay published What Not, a novel about a dystopian future—a brave new world if you will—where people are ranked by intelligence, the government mandates mind training for all citizens, and procreation is regulated by the state. You’ve probably never heard of Macaulay or What Not. However, Aldous Huxley, author of the science fiction classic Brave New World, hung out in the same London literary circles as she did, and his 1932 book contains many concepts that Macaulay first introduced in her work. In 2019, you’ll be able to read Macaulay’s book yourself and compare the texts, as the British publisher Handheld Press is planning to re-release the forgotten novel in March. It’s been out of print since the year it was first released.
  • The resurfacing of What Not also makes this a prime time to consider another work that influenced Huxley’s Brave New World, the 1923 novel We by Yevgeny Zamyatin. What Not and We are lost classics about a future that foreshadows our present. Notably, they are also hidden influences on some of the most significant works of 20th century fiction, Brave New World and George Orwell’s 1984.
  • In Macaulay’s book—which is a hoot and well worth reading—a democratically elected British government has been replaced with a “United Council, five minds with but a single thought—if that,” as she put it. Huxley’s Brave New World is run by a similarly small group of elites known as “World Controllers.”
  • citizens of What Not are ranked based on their intelligence from A to C3 and can’t marry or procreate with someone of the same rank to ensure that intelligence is evenly distributed
  • Brave New World is more futuristic and preoccupied with technology than What Not. In Huxley’s world, procreation and education have become completely mechanized and emotions are strictly regulated pharmaceutically. Macaulay’s Britain is just the beginning of this process, and its characters are not yet completely indoctrinated into the new ways of the state—they resist it intellectually and question its endeavors, like the newly-passed Mental Progress Act. She writes: “He did not like all this interfering, socialist what-not, which was both upsetting the domestic arrangements of his tenants and trying to put into their heads more learning than was suitable for them to have. For his part he thought every man had a right to be a fool if he chose, yes, and to marry another fool, and to bring up a family of fools too.”
  • Where Huxley pairs dumb but pretty and “pneumatic” ladies with intelligent gentlemen, Macaulay’s work is decidedly less sexist.
  • We was published in French, Dutch, and German. An English version was printed and sold only in the US. When Orwell wrote about We in 1946, it was only because he’d managed to borrow a hard-to-find French translation.
  • While Orwell never indicated that he read Macaulay, he shares her subversive and subtle linguistic skills and satirical sense. His protagonist, Winston—like Kitty—works for the government in its Ministry of Truth, or Minitrue in Newspeak, where he rewrites historical records to support whatever Big Brother currently says is good for the regime. Macaulay would no doubt have approved of Orwell’s wit. And his state ministries bear a striking similarity to those she wrote about in What Not.
  • Orwell was familiar with Huxley’s novel and gave it much thought before writing his own blockbuster. Indeed, in 1946, before the release of 1984, he wrote a review of Zamyatin’s We, comparing the Russian novel with Huxley’s book. Orwell declared Huxley’s text derivative, writing in his review of We in The Tribune: “The first thing anyone would notice about We is the fact—never pointed out, I believe—that Aldous Huxley’s Brave New World must be partly derived from it. Both books deal with the rebellion of the primitive human spirit against a rationalised, mechanized, painless world, and both stories are supposed to take place about six hundred years hence. The atmosphere of the two books is similar, and it is roughly speaking the same kind of society that is being described, though Huxley’s book shows less political awareness and is more influenced by recent biological and psychological theories.”
  • In We, the story is told by D-503, a male engineer, while in Brave New World we follow Bernard Marx, a protagonist with a proper name. Both characters live in artificial worlds, separated from nature, and they recoil when they first encounter people who exist outside of the state’s constructed and controlled cities.
  • Although We is barely known compared to Orwell and Huxley’s later works, I’d argue that it’s among the best works of literary science fiction of all time, and it’s as relevant now as it was when first written. Noam Chomsky calls it “more perceptive” than both 1984 and Brave New World. Zamyatin’s futuristic society was so on point, he was exiled from the Soviet Union because it was such an accurate description of life in a totalitarian regime, though he wrote it before Stalin took power.
  • Macaulay’s work is more subtle and funny than Huxley’s. Despite being a century old, What Not is remarkably relevant and readable, a satire that only highlights how little has changed in the years since its publication and how dangerous and absurd state policies can be. In this sense then, What Not reads more like George Orwell’s 1949 novel 1984 
  • Orwell was critical of Zamyatin’s technique. “[We] has a rather weak and episodic plot which is too complex to summarize,” he wrote. Still, he admired the work as a whole. “[Its] intuitive grasp of the irrational side of totalitarianism—human sacrifice, cruelty as an end in itself, the worship of a Leader who is credited with divine attributes—[…] makes Zamyatin’s book superior to Huxley’s,”
  • Like our own tech magnates and nations, the United State of We is obsessed with going to space.
  • Perhaps in 2019 Macaulay’s What Not, a clever and subversive book, will finally get its overdue recognition.
Ed Webb

Why Doesn't Anyone Pay Attention Anymore? | HASTAC

  • We also need to distinguish what scientists know about human neurophysiology from our all-too-human discomfort with cultural and social change.  I've been an English professor for over twenty years and have heard how students don't pay attention, can't read a long novel anymore, and are in decline against some unspecified norm of an idealized past quite literally every year that I have been in this profession. In fact, how we educators should address this dire problem was the focus of the very first faculty meeting I ever attended.
  • Whenever I hear about attentional issues in debased contemporary society, whether blamed on television, VCR's, rock music, or the desktop, I assume that the critic was probably, like me, the one student who actually read Moby Dick and who had little awareness that no one else did.
  • This is not really a discussion about the biology of attention; it is about the sociology of change.
  • The brain is always changed by what it does.  That's how we learn, from infancy on, and that's how a baby born in New York has different cultural patterns of behavior, language, gesture, interaction, socialization, and attention than a baby born the same day in Beijing. That's as true for the historical moment into which we are born as it is for the geographical location.  Our attention is shaped by all we do, and reshaped by all we do.  That is what learning is.  The best we can do as educators is find ways to improve our institutions of learning to help our kids be prepared for their future--not for our past.
  • I didn't find the article nearly as stigmatizing and retrograde as I do the knee-jerk Don't Tread on Me reactions of everyone I've seen respond--most of which amount to foolish technolibertarian celebrations of the anonymous savior Technology (Cathy, you don't do that there, even if you also have nothing good to say about the NYT piece). If anything, the article showed that these kids (like all of us!) are profoundly distressed by today's media ecology. They seem to have a far more subtle perspective on things than most others. Frankly I'm a bit gobstopped that everyone hates this article so much. As for the old chestnut that "we need new education for the information age," it's worth pointing out that there was no formal, standardized education system before the industrial age. Compulsory education is a century-old experiment. And yes, it ought to be discarded. But that's a frightening prospect for almost everyone, including those who advocate for it. I wonder how many of the intelligentsia who raise their fists and cry, "We need a different education system!" still partake of the old system for their own kids. We don't in my house, for what it's worth, and it's a huge pain in the ass.
  • Cathy -- I really appreciate the distinctions you make between "the biology of attention" and "the sociology of change." And I agree that more complex and nuanced conversations about technology's relationship to attention, diversion, focus, and immersion will be more productive (than either nostalgia or utopic futurism). For example, it seems like a strange oversight (in the NYT piece) to bemoan the ability of "kids these days" to focus, read immersively, or Pay Attention, yet report without comment that these same kids can edit video for hours on end -- creative, immersive work which, I would imagine, requires more than a little focus. It seems that perhaps the question is not whether we can still pay attention or focus, but what those diverse forms of immersion within different media (will) look like.
  • I recommend both this commentary and the original NYT piece to which it links and on which it comments.
Ed Webb

The Web Means the End of Forgetting - NYTimes.com

  • for a great many people, the permanent memory bank of the Web increasingly means there are no second chances — no opportunities to escape a scarlet letter in your digital past. Now the worst thing you’ve done is often the first thing everyone knows about you.
  • a collective identity crisis. For most of human history, the idea of reinventing yourself or freely shaping your identity — of presenting different selves in different contexts (at home, at work, at play) — was hard to fathom, because people’s identities were fixed by their roles in a rigid social hierarchy. With little geographic or social mobility, you were defined not as an individual but by your village, your class, your job or your guild. But that started to change in the late Middle Ages and the Renaissance, with a growing individualism that came to redefine human identity. As people perceived themselves increasingly as individuals, their status became a function not of inherited categories but of their own efforts and achievements. This new conception of malleable and fluid identity found its fullest and purest expression in the American ideal of the self-made man, a term popularized by Henry Clay in 1832.
  • the dawning of the Internet age promised to resurrect the ideal of what the psychiatrist Robert Jay Lifton has called the “protean self.” If you couldn’t flee to Texas, you could always seek out a new chat room and create a new screen name. For some technology enthusiasts, the Web was supposed to be the second flowering of the open frontier, and the ability to segment our identities with an endless supply of pseudonyms, avatars and categories of friendship was supposed to let people present different sides of their personalities in different contexts. What seemed within our grasp was a power that only Proteus possessed: namely, perfect control over our shifting identities. But the hope that we could carefully control how others view us in different contexts has proved to be another myth. As social-networking sites expanded, it was no longer quite so easy to have segmented identities: now that so many people use a single platform to post constant status updates and photos about their private and public activities, the idea of a home self, a work self, a family self and a high-school-friends self has become increasingly untenable. In fact, the attempt to maintain different selves often arouses suspicion.
  • All around the world, political leaders, scholars and citizens are searching for responses to the challenge of preserving control of our identities in a digital world that never forgets. Are the most promising solutions going to be technological? Legislative? Judicial? Ethical? A result of shifting social norms and cultural expectations? Or some mix of the above?
  • These approaches share the common goal of reconstructing a form of control over our identities: the ability to reinvent ourselves, to escape our pasts and to improve the selves that we present to the world.
  • many technological theorists assumed that self-governing communities could ensure, through the self-correcting wisdom of the crowd, that all participants enjoyed the online identities they deserved. Wikipedia is one embodiment of the faith that the wisdom of the crowd can correct most mistakes — that a Wikipedia entry for a small-town mayor, for example, will reflect the reputation he deserves. And if the crowd fails — perhaps by turning into a digital mob — Wikipedia offers other forms of redress
  • In practice, however, self-governing communities like Wikipedia — or algorithmically self-correcting systems like Google — often leave people feeling misrepresented and burned. Those who think that their online reputations have been unfairly tarnished by an isolated incident or two now have a practical option: consulting a firm like ReputationDefender, which promises to clean up your online image. ReputationDefender was founded by Michael Fertik, a Harvard Law School graduate who was troubled by the idea of young people being forever tainted online by their youthful indiscretions. “I was seeing articles about the ‘Lord of the Flies’ behavior that all of us engage in at that age,” he told me, “and it felt un-American that when the conduct was online, it could have permanent effects on the speaker and the victim. The right to new beginnings and the right to self-definition have always been among the most beautiful American ideals.”
  • In the Web 3.0 world, Fertik predicts, people will be rated, assessed and scored based not on their creditworthiness but on their trustworthiness as good parents, good dates, good employees, good baby sitters or good insurance risks.
  • “Our customers include parents whose kids have talked about them on the Internet — ‘Mom didn’t get the raise’; ‘Dad got fired’; ‘Mom and Dad are fighting a lot, and I’m worried they’ll get a divorce.’ ”
  • as facial-recognition technology becomes more widespread and sophisticated, it will almost certainly challenge our expectation of anonymity in public
  • Ohm says he worries that employers would be able to use social-network-aggregator services to identify people’s book and movie preferences and even Internet-search terms, and then fire or refuse to hire them on that basis. A handful of states — including New York, California, Colorado and North Dakota — broadly prohibit employers from discriminating against employees for legal off-duty conduct like smoking. Ohm suggests that these laws could be extended to prevent certain categories of employers from refusing to hire people based on Facebook pictures, status updates and other legal but embarrassing personal information. (In practice, these laws might be hard to enforce, since employers might not disclose the real reason for their hiring decisions, so employers, like credit-reporting agents, might also be required by law to disclose to job candidates the negative information in their digital files.)
  • research group’s preliminary results suggest that if rumors spread about something good you did 10 years ago, like winning a prize, they will be discounted; but if rumors spread about something bad that you did 10 years ago, like driving drunk, that information has staying power
  • many people aren’t worried about false information posted by others — they’re worried about true information they’ve posted about themselves when it is taken out of context or given undue weight. And defamation law doesn’t apply to true information or statements of opinion. Some legal scholars want to expand the ability to sue over true but embarrassing violations of privacy — although it appears to be a quixotic goal.
  • Researchers at the University of Washington, for example, are developing a technology called Vanish that makes electronic data “self-destruct” after a specified period of time. Instead of relying on Google, Facebook or Hotmail to delete the data that is stored “in the cloud” — in other words, on their distributed servers — Vanish encrypts the data and then “shatters” the encryption key. To read the data, your computer has to put the pieces of the key back together, but they “erode” or “rust” as time passes, and after a certain point the document can no longer be read.
  • Plenty of anecdotal evidence suggests that young people, having been burned by Facebook (and frustrated by its privacy policy, which at more than 5,000 words is longer than the U.S. Constitution), are savvier than older users about cleaning up their tagged photos and being careful about what they post.
  • norms are already developing to recreate off-the-record spaces in public, with no photos, Twitter posts or blogging allowed. Milk and Honey, an exclusive bar on Manhattan’s Lower East Side, requires potential members to sign an agreement promising not to blog about the bar’s goings on or to post photos on social-networking sites, and other bars and nightclubs are adopting similar policies. I’ve been at dinners recently where someone has requested, in all seriousness, “Please don’t tweet this” — a custom that is likely to spread.
  • There’s already a sharp rise in lawsuits known as Twittergation — that is, suits to force Web sites to remove slanderous or false posts.
  • strategies of “soft paternalism” that might nudge people to hesitate before posting, say, drunken photos from Cancún. “We could easily think about a system, when you are uploading certain photos, that immediately detects how sensitive the photo will be.”
  • It’s sobering, now that we live in a world misleadingly called a “global village,” to think about privacy in actual, small villages long ago. In the villages described in the Babylonian Talmud, for example, any kind of gossip or tale-bearing about other people — oral or written, true or false, friendly or mean — was considered a terrible sin because small communities have long memories and every word spoken about other people was thought to ascend to the heavenly cloud. (The digital cloud has made this metaphor literal.) But the Talmudic villages were, in fact, far more humane and forgiving than our brutal global village, where much of the content on the Internet would meet the Talmudic definition of gossip: although the Talmudic sages believed that God reads our thoughts and records them in the book of life, they also believed that God erases the book for those who atone for their sins by asking forgiveness of those they have wronged. In the Talmud, people have an obligation not to remind others of their past misdeeds, on the assumption they may have atoned and grown spiritually from their mistakes. “If a man was a repentant [sinner],” the Talmud says, “one must not say to him, ‘Remember your former deeds.’ ” Unlike God, however, the digital cloud rarely wipes our slates clean, and the keepers of the cloud today are sometimes less forgiving than their all-powerful divine predecessor.
  • On the Internet, it turns out, we’re not entitled to demand any particular respect at all, and if others don’t have the empathy necessary to forgive our missteps, or the attention spans necessary to judge us in context, there’s nothing we can do about it.
  • Gosling is optimistic about the implications of his study for the possibility of digital forgiveness. He acknowledged that social technologies are forcing us to merge identities that used to be separate — we can no longer have segmented selves like “a home or family self, a friend self, a leisure self, a work self.” But although he told Facebook, “I have to find a way to reconcile my professor self with my having-a-few-drinks self,” he also suggested that as all of us have to merge our public and private identities, photos showing us having a few drinks on Facebook will no longer seem so scandalous. “You see your accountant going out on weekends and attending clown conventions, that no longer makes you think that he’s not a good accountant. We’re coming to terms and reconciling with that merging of identities.”
  • a humane society values privacy, because it allows people to cultivate different aspects of their personalities in different contexts; and at the moment, the enforced merging of identities that used to be separate is leaving many casualties in its wake.
  • we need to learn new forms of empathy, new ways of defining ourselves without reference to what others say about us and new ways of forgiving one another for the digital trails that will follow us forever
Ed Webb

Hayabusa2 and the unfolding future of space exploration | Bryan Alexander

  • What might this tell us about the future?  Let’s consider Ryugu as a datapoint or story for where space exploration might head next.
  • There isn’t a lot of press coverage beyond Japan (ah, once again I wish I read Japanese), if I go by Google News headlines.  There’s nothing on the CNN.com homepage now, other than typical spatters of dread and celebrity; the closest I can find is a link to a story about Musk’s space tourism project, which a Japanese billionaire will ride.  Nothing on Fox News or MSNBC’s main pages.  BBC News at least has a link halfway down its main page.
  • Hayabusa is a Japanese project, not an American one, and national interest counts for a lot.  No humans were involved, so human interest and story are absent.  Perhaps the whole project looks too science-y for a culture that spins into post-truthiness, contains some serious anti-science and anti-technology strands, or just finds science stories too dry.  Or maybe the American media outlets think Americans just aren’t that into space in particular in 2018.
  • Hayabusa2 reminds us that space exploration is more multinational and more disaggregated than ever.  Besides JAXA there are space programs being built up by China and India, including robot craft, astronauts (taikonauts for China, vyomanauts for India), and space stations.  The Indian Mars Orbiter still circles the fourth planet. The European Space Agency continues to develop satellites and launch rockets, like the JUICE (JUpiter ICy moons Explorer).  Russia is doing some mixture of commercial spaceflight, ISS maintenance, exploration, and geopoliticking.  For these nations space exploration holds out a mixture of prestige, scientific and engineering development, and possible commercial return.
  • Bezos, Musk, and others live out a Robert Heinlein story by building up their own personal space efforts.  This is, among other things, a sign of how far American wealth has grown, and how much of the elite are connected to technical skills (as opposed to inherited wealth).  It’s an effect of plutocracy, as I’ve said before.  Yuri Milner might lead the first interstellar mission with his Breakthrough Starshot plan.
  • Privatization of space seems likely to continue.
  • Uneven development is also likely, as different programs struggle to master different stations in the space path.  China may assemble a space station while Japan bypasses orbital platforms for the moon, private cubesats head into the deep solar system and private companies keep honing their Earth orbital launch skills.
  • Surely the challenges of getting humans and robots further into space will elicit interesting projects that can be used Earthside.  Think about health breakthroughs needed to keep humans alive in environments scoured by radiation, or AI to guide robots through complex situations.
  • robots continue to be cheap, far easier to operate, capable of enduring awful stresses, and happy to send gorgeous data back our way
  • Japan seems committed to creating a lunar colony.  Musk and Bezos burn with the old science fiction and NASA hunger for shipping humans into the great dark.  The lure of Mars seems to be a powerful one, and a multinational, private versus public race could seize the popular imagination.  Older people may experience a rush of nostalgia for the glorious space race of their youth.
  • This competition could turn malign, of course.  Recall that the 20th century’s space races grew out of warfare, and included many plans for combat and destruction. Nayef Al-Rodhan hints at possible strains in international cooperation: The possible fragmentation of outer space research activities in the post-ISS period would constitute a break-up of an international alliance that has fostered unprecedented cooperation between engineers and scientists from rival geopolitical powers – aside from China. The ISS represents perhaps the pinnacle of post-Cold War cooperation and has allowed for the sharing and streamlining of work methods and differing norms. In a current period of tense relations, it is worrying that the US and Russia may be ending an important phase of cooperation.
  • Space could easily become the ground for geopolitical struggles once more, and possibly a flashpoint as well.  Nationalism, neonationalism, nativism could power such stresses
  • Enough of an off-Earth settlement could lead to further forays, once we bypass the terrible problem of getting off the planet’s surface, and if we can develop new ways to fuel and sustain craft in space.  The desire to connect with that domain might help spur the kind of space elevator which will ease Earth-to-orbit challenges.
  • The 1960s space race saw the emergence of a kind of astronaut cult.  The Soviet space program’s Russian roots included a mystical tradition.  We could see a combination of nostalgia from older folks and can-do optimism from younger people, along with further growth in STEM careers and interest.  Dialectically we should expect the opposite.  A look back at the US-USSR space race shows criticism and opposition ranging from the arts (Gil Scott-Heron’s “Whitey on the Moon”, Jello Biafra’s “Why I’m Glad the Space Shuttle Blew Up”) to opinion polls (in the US NASA only won real support for the year around Apollo 11, apparently).  We can imagine all kinds of political opposition to a 21st century space race, from people repeating the old Earth versus space spending canard to nationalistic statements (“Let Japan land on Deimos.  We have enough to worry about here in Chicago”) to environmental concerns to religious ones.  Concerns about vast wealth and inequality could well target space.
  • How will we respond when, say, twenty space tourists crash into a lunar crater and die, in agony, on YouTube?
  • That’s a lot to hang on one Japanese probe landing two tiny ‘bots on an asteroid in 2018, I know.  But Hayabusa2 is such a signal event that it becomes a fine story to think through.
Ed Webb

Glenn Greenwald: How America's Surveillance State Breeds Conformity and Fear | Civil Li...

  • The Surveillance State hovers over any attacks that meaningfully challenge state-appropriated power. It doesn’t just hover over it. It impedes it, it deters it and kills it.  That’s its intent. It does that by design.
  • the realization quickly emerged that, allowing government officials to eavesdrop on other people, on citizens, without constraints or oversight, to do so in the dark, is a power that gives so much authority and leverage to those in power that it is virtually impossible for human beings to resist abusing that power.  That’s how potent of a power it is.
  • If a dictator takes over the United States, the NSA could enable it to impose total tyranny, and there would be no way to fight back.
  • Now it’s virtually a religious obligation to talk about the National Security State and its close cousin, the Surveillance State, with nothing short of veneration.
  • The NSA, beginning 2001, was secretly ordered to spy domestically on the communications of American citizens. It has escalated in all sorts of lawless, and now lawful ways, such that it is now an enormous part of what that agency does. Even more significantly, the technology that it has developed is now shared by a whole variety of agencies, including the FBI
  • Now, the Patriot Act is completely uncontroversial. It gets renewed without any notice every three years with zero reforms, no matter which party is in control.
  • They are two, as I said, established Democrats warning that the Democratic control of the Executive branch is massively abusing this already incredibly broad Patriot Act. And one of the things they are trying to do is extract some basic information from the NSA about what it is they’re doing in terms of the surveillance on the American people.  Because even though they are on the Intelligence Committee, they say they don’t even know the most basic information about what the NSA does including even how many Americans have had their e-mails read or had their telephone calls intercepted by the NSA.
  • "We can’t tell you how many millions of Americans are having their e-mails read by us, and their telephone calls listened in by us, because for us to tell you that would violate the privacy of American citizens."
  • An article in Popular Mechanics in 2004 reported on a study of American surveillance and this is what it said: “There are an estimated 30 million surveillance cameras now deployed in the United States shooting 4 billion hours of footage a week. Americans are being watched. All of us, almost everywhere.” There is a study in 2006 that estimated that that number would quadruple to 100 million cameras -- surveillance cameras -- in the United States within five years largely because of the bonanza of post-9/11 surveilling. 
  • it’s not just the government that is engaged in surveillance, but just as menacingly, corporations, private corporations, engage in huge amounts of surveillance on us. They give us cell phones that track every moment of where we are physically, and then provide that to law enforcement agencies without so much as a search warrant.  Obviously, credit card and banking transactions are reported, and tell anyone who wants to know everything we do. We talk about the scandal of the Bush eavesdropping program that was not really a government eavesdropping program, so much as it was a private industry eavesdropping program. It was done with the direct and full cooperation of AT&T, Sprint, Verizon and the other telecom giants. In fact, when you talk about the American Surveillance State, what you’re really talking about is no longer public government agencies. What you’re talking about is a full-scale merger between the federal government and industry. That is what the Surveillance State is
  • The principle being that there can be no human interaction, especially no human communication, not just international communication between foreign nations but communication by American citizens on American soil, that is beyond the reach of the U.S. government.
  • at exactly the same time that the government has been massively expanding its ability to know everything that we’re doing it has simultaneously erected a wall of secrecy around it that prevents us from knowing anything that they’re doing
  • government now operates with complete secrecy, and we have none
  • it makes people believe that they’re free even though they’ve been subtly convinced that there are things that they shouldn’t do that they might want to do
  • what has happened in the last three to four years is a radical change in the war on terror. The war on terror has now been imported into United States policy. It is now directed at American citizens on American soil. So rather than simply sending drones to foreign soil to assassinate foreign nationals, we are now sending drones to target and kill American citizens without charges or trial. Rather than indefinitely detaining foreign nationals like Guantanamo, Congress last year enacted, and President Obama signed, the National Defense Authorization Act that permits the detention -- without trial, indefinitely -- of American citizens on U.S. soil.
  • this is what the Surveillance State is designed to do.  It’s justified in the name of terrorism, of course; that’s the packaging in which it’s wrapped, and it’s been used extensively, and in all sorts of ways, since 9/11 for domestic application. And that’s happening even more. It’s happening in terms of the Occupy movement and the infiltration that federal officials were able to accomplish using Patriot Act authorities. It’s happened with pro-Palestinian activists in the United States and all other dissident groups that have themselves [dealt] with surveillance and law enforcement used by what was originally the war on terror powers.
  • if the government is able to know what we speak about and know who we’re talking to, know what it is that we’re planning, it makes any kind of activism extremely difficult. Because secrecy and privacy are prerequisites to effective actions.
  • we are training our young citizens to live in a culture where they expect they are always being watched. And we want them to be chilled, we want them to be deterred, we want them never to challenge orthodoxy or to explore limits or engage in creativity of any kind. This type of surveillance, by design, breeds conformism.  That’s its purpose. That’s what makes surveillance so pernicious.
  • If you go and speak to communities of American Muslims, you find an incredibly pervasive climate of fear.
  • This climate of fear creates limits around the behavior in which they’re willing to engage in very damaging ways
  • governments, when they want to give themselves abusive and radical powers, typically first target people who they think their citizens won’t care very much about, because they’ll think they’re not affected by it
  • the psychological effects on East German people endure until today. The way in which they have been trained for decades to understand that there are limits to their life, even once you remove the limits, they’ve been trained that those are not things they need to transgress.
  • Rosa Luxemburg said, “He who does not move does not notice his chains.”
  • You can acculturate people to believing that tyranny is freedom, that their limits are actually emancipations and freedom. That is what this Surveillance State does: by training people to accept their own conformity, to believe that they are actually free, so that they no longer even realize the ways in which they’re being limited.
  • important means of subverting this one-way mirror that I’ve described is forcible, radical transparency. It’s one of the reasons I support, so enthusiastically and unqualifiedly, groups like Anonymous and WikiLeaks. I want holes to be blown in the wall of secrecy.
  • There are things like the Tor project and other groups that enable people to use the internet without any detection from government authorities. That has the effect of preventing regimes that actually bar their citizens from using the Internet from doing so since you can no longer trace the origins of the Internet user. But it also protects people who live in countries like ours where the government is trying to constantly monitor what we do by sending our communications through multiple proxies around the world that can’t be invaded. There’s really a war taking place: an arms race where the government and these groups are attempting to stay one tactical step ahead of the other. In terms of ability to shield internet communications from the government and the government’s ability to invade them and participating in this war in ways that are supportive of the “good side” are really critical as is veiling yourself from the technology that exists, to make what you do as tight as possible.
Ed Webb

Everybody Blogs - Brainstorm - The Chronicle of Higher Education

  • Pete said “everybody blogs” in the same tone of voice that people use when they refer to the children’s book Everybody Poops, thereby making blogging seem as if it’s just as thoughtful and intellectual of an activity as the subject of that children’s classic.
  • the blogs I read regularly have far more to offer. Some I read for information, some to get me thinking about a topic and to inform me what others have already thought, some to amuse me, some to delight me, some to make me angry, and some because I am trying to find yet another way to distract me from finishing those introductions
Ed Webb

Could self-aware cities be the first forms of artificial intelligence?

  • People have speculated before about the idea that the Internet might become self-aware and turn into the first "real" A.I., but could it be more likely to happen to cities, in which humans actually live and work and navigate, generating an even more chaotic system?
  • "By connecting and providing visibility into disparate systems, cities and buildings can operate like living organisms, sensing and responding quickly to potential problems before they occur to protect citizens, save resources and reduce energy consumption and carbon emissions," reads the invitation to IBM's PULSE 2010 event.
  • And Cisco is already building the first of these smart cities: Songdo, a Korean "instant city," which will be completely controlled by computer networks — including ubiquitous Telepresence applications, video screens which could be used for surveillance. Cisco's chief globalization officer, Wim Elfrink, told the San Jose Mercury News: "Everything will be connected - buildings, cars, energy - everything. This is the tipping point. When we start building cities with technology in the infrastructure, it's beyond my imagination what that will enable."
  • Urbanscale founder Adam Greenfield has written a lot about ubiquitous computing in urban environments, most notably in 2006's Everyware, which posits that computers will "effectively disappear" as objects around us become "smart" in ways that are nearly invisible to lay-people.
  • tailored advertising just about anywhere
  • Some futurists are still predicting that cities will become closer to arcologies — huge slabs of integrated urban life, like a whole city in a single block — as they grapple with the need to house so many people in an efficient fashion. The implications for heating and cooling an arcology, let alone dealing with waste disposal, are mind-boggling. Could a future arcology become our first machine mind?
  • Science fiction gives us the occasional virtual worlds that look rural — like Doctor Who's visions of life inside the Matrix, which mostly looks (not surprisingly) like a gravel quarry — but for the most part, virtual worlds are always urban
  • So here's why cities might have an edge over, say, the Internet as a whole, when it comes to developing self awareness. Because every city is different, and every city has its own identity and sense of self — and this informs everything from urban planning to the ways in which parking and electricity use are mapped out. The more sophisticated the integrated systems associated with a city become, the more they'll reflect the city's unique personality, and the more programmers will try to imbue their computers with a sense of this unique urban identity. And a sense of the city's history, and the ways in which the city has evolved and grown, will be important for a more sophisticated urban planning system to grasp the future — so it's very possible to imagine this leading to a sense of personal history, on the part of a computer that identifies with the city it helps to manage.
  • next time you're wandering around your city, looking up at the outcroppings of huge buildings, the wild tides of traffic and the frenzy of construction and demolition, don't just think of it as a place haunted by history. Try, instead, to imagine it coming to life in a new way, opening its millions of electronic eyes, and greeting you with the first gleaming of independent thought
  • I can't wait for the day when city AIs decide to go to war with other city AIs over allocation of federal funds.
  • John Shirley has San Francisco as a sentient being in City Come A-Walkin'
  • I doubt cities will ever be networked so smoothly... they are all about fractions, sections, niches, subcultures, ethnicities, neighborhoods, markets, underground markets. It's literally like herding cats... I don't see it as feasible. It would be a schizophrenic intelligence at best. Which, Wintermute was I suppose...
  •  
    This is beginning to sound just like the cities we have read about. It reminds me of the Burning Chrome stories, since an element in all of those stories was machines and technology at every turn. With recent advances in technology, it is alarming to see an element of so many science fiction tales finally coming true: a city that acts as a machine in itself. Who is to say that this city won't develop a highly active hacker underbelly?
Ed Webb

British Art Robots | Beyond The Beyond - 0 views

  •  
    If only for his love of robots, Bruce's blog is a must-read.
Ed Webb

BBC News - A Point of View: Why Orwell was a literary mediocrity - 0 views

  •  
    I often disagree with Will Self, whom I find unbearably pretentious. But reading those we disagree with can be very productive.
Ed Webb

Scientific blinders: Learning from the moral failings of Nazi physicists - Bulletin of ... - 0 views

  • As the evening progressed, more and more questions concerning justice and ethics occurred to the physicists: Are atomic weapons inherently inhumane, and should they never be used? If the Germans had come to possess such weapons, what would be the world’s fate? What constitutes real patriotism in Nazi Germany—working for the regime’s success, or its defeat? The scientists expressed surprise and bafflement at their colleagues’ opinions, and their own views sometimes evolved from one moment to the next. The scattered, changing opinions captured in the Farm Hall transcripts highlight that, in their five years on the Nazi nuclear program, the German physicists had likely failed to wrestle meaningfully with these critical questions.
  • looking back at the Uranium Club serves to remind us scientists of how easy it is to focus on technical matters and avoid considering moral ones. This is especially true when the moral issues are perplexing, when any negative impacts seem distant, and when the science is exciting.
  • engineers who develop tracking or facial-recognition systems may be creating tools that can be purchased by repressive regimes intent on spying on and suppressing dissent. Accordingly, those researchers have a certain obligation to consider their role and the impact of their work.
  • ...2 more annotations...
  • reflecting seriously on the societal context of a research position may prompt a scientist to accept the job—and to take it upon herself or himself to help restrain unthinking innovation at work, by raising questions about whether every feature that can be added should in fact be implemented. (The same goes for whether certain lines of research should be pursued and particular findings published.)
  • The challenge for each of us, moving forward, is to ask ourselves and one another, hopefully far earlier in the research process than did Germany’s Walther Gerlach: “What are we working for?”
  •  
    If you get the opportunity, see, or at least read, the plays The Physicists (Die Physiker) by Friedrich Dürrenmatt and Copenhagen by Michael Frayn.
Ed Webb

Can Sci-Fi Writers Prepare Us for an Uncertain Future? | WIRED - 0 views

  • a growing contingent of sci-fi writers being hired by think tanks, politicians, and corporations to imagine—and predict—the future
  • Harvard Business Review made the corporate case for reading sci-fi years ago, and mega consulting firm PricewaterhouseCoopers published a guide on how to use sci-fi to “explore innovation.” The New Yorker has touted “better business through sci-fi.” As writer Brian Merchant put it, “Welcome to the Sci-Fi industrial complex.”
  • The use of sci-fi has bled into government and public policy spheres. The New America Foundation recently held an all-day event discussing “What Sci-Fi Futures Can (and Can't) Teach Us About AI Policy.” And Nesta, an organization that generates speculative fiction, has committed $24 million to grow “new models of public services” in collaboration with the UK government
  • ...8 more annotations...
  • Some argue that there is power in narrative stories that can’t be found elsewhere. Others assert that in our quest for imagination and prediction, we’re deluding ourselves into thinking that we can predict what’s coming
  • The World Future Society and the Association of Professional Futurists represent a small but growing group of professionals, many of whom have decades of experience thinking about long-term strategy and “scenario planning”—a method used by organizations to try and prepare for possible futures.
  • true Futurism is often pretty unsexy. It involves sifting through a lot of data and research and models and spreadsheets. Nobody is going to write a profile of your company or your government project based on a dry series of models outlining carefully caveated possibilities. On the other hand, worldbuilding—the process of imagining a universe in which your fictional stories can exist—is fun. People want stories, and science fiction writers can provide them.
  • Are those who write epic space operas (no matter how good those space operas might be) really the right people to ask about the future of work or water policy or human rights?
  • critics worry that writers are so good at spinning stories that they might even convince you those stories are true. In actuality, history shows us that predictions are nearly impossible to make and that humans are catastrophically bad at guessing what the future will hold
  • it's important to distinguish between prediction and impact. Did Star Trek anticipate the cell phone, or were the inventors of the cell phone inspired by Star Trek? Listicles of “all the things sci-fi has predicted” are largely exercises in cherry picking—they never include the things that sci-fi got wrong
  • In this line of work, specifics matter. It’s one thing to write a book about a refugee crisis, but quite another to predict exactly how the Syrian refugee crisis unfolded
  • It’s tempting to turn to storytelling in times of crisis—and it’s hard to argue that we’re not in a time of crisis now. Within dystopian pieces of fiction there are heroes and survivors, characters we can identify with who come out the other side and make it out OK. Companies and governments and individuals all want to believe that they will be among those lucky few, the heroes of the story. And science fiction writers can deliver that, for a fee.
Ed Webb

At age 13, I joined the alt-right, aided by Reddit and Google - 0 views

  • Now, I’m 16, and I’ve been able to reflect on how I got sucked into that void—and how others do, too. My brief infatuation with the alt-right has helped me understand the ways big tech companies and their algorithms are contributing to the problem of radicalization—and why it’s so important to be skeptical of what you read online.
  • while a quick burst of radiation probably won’t give you cancer, prolonged exposure is far more dangerous. The same is true for the alt-right. I knew that the messages I was seeing were wrong, but the more I saw them, the more curious I became. I was unfamiliar with most of the popular discussion topics on Reddit. And when you want to know more about something, what do you do? You probably don’t think to go to the library and check out a book on that subject, and then fact check and cross reference what you find. If you just google what you want to know, you can get the information you want within seconds.
  • I started googling things like “Illegal immigration,” “Sandy Hook actors,” and “Black crime rate.” And I found exactly what I was looking for.
  • ...11 more annotations...
  • The articles and videos I first found all backed up what I was seeing on Reddit—posts that asserted a skewed version of actual reality, using carefully selected, out-of-context, and dubiously sourced statistics that propped up a hateful world view. On top of that, my online results were heavily influenced by something called an algorithm. I understand algorithms to be secretive bits of code that a website like YouTube will use to prioritize content that you are more likely to click on first. Because all of the content I was reading or watching was from far-right sources, all of the links that the algorithms dangled on my screen for me to click were from far-right perspectives. (A toy sketch of this kind of engagement-driven ranking appears after these annotations.)
  • I spent months isolated in my room, hunched over my computer, removing and approving memes on Reddit and watching conservative “comedians” that YouTube served up to me.
  • The inflammatory language and radical viewpoints used by the alt-right worked in YouTube and Google's favor—the more videos and links I clicked on, the more ads I saw, and in turn, the more ad revenue they generated.
  • the biggest step in my recovery came when I attended a pro-Trump rally in Washington, D.C., in September 2017, about a month after the “Unite the Right” rally in Charlottesville, Virginia
  • The difference between the online persona of someone who identifies as alt-right and the real thing is so extreme that you would think they are different people. Online, they have the power of fake and biased news to form their arguments. They sound confident and usually deliver their standard messages strongly. When I met them in person at the rally, they were awkward and struggled to back up their statements. They tripped over their own words, and when they were called out by any counter protestors in the crowd, they would immediately use a stock response such as “You’re just triggered.”
  • Seeing for myself that the people I was talking to online were weak, confused, and backwards was the turning point for me.
  • we’re too far gone to reverse the damage that the alt-right has done to the internet and to naive adolescents who don’t know any better—children like the 13-year-old boy I was. It’s convenient for a massive internet company like Google to deliberately ignore why people like me get misinformed in the first place, as their profit-oriented algorithms continue to steer ignorant, malleable people into the jaws of the far-right
  • Dylann Roof, the white supremacist who murdered nine people in a Charleston, South Carolina, church in 2015, was radicalized by far-right groups that spread misinformation with the aid of Google’s algorithms.
  • Over the past couple months, I’ve been getting anti-immigration YouTube ads that feature an incident presented as a “news” story, about two immigrants who raped an American girl. The ad offers no context or sources, and uses heated language to denounce immigration and call for our county to allow ICE to seek out illegal immigrants within our area. I wasn’t watching a video about immigration or even politics when those ads came on; I was watching the old Monty Python “Cheese Shop” sketch. How does British satire, circa 1972, relate to America’s current immigration debate? It doesn’t.
  • tech companies need to be held accountable for the radicalization that results from their systems and standards.
  • anyone can be manipulated like I was. It’s so easy to find information online that we collectively forget that so much of the content the internet offers us is biased
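The “algorithm” described in these annotations is opaque in its real form, but the feedback loop it creates can be shown with a toy, entirely hypothetical sketch (this is not YouTube's or Google's actual code): rank candidate items by how much their topics overlap with what the user has already clicked.

```python
# Toy, hypothetical sketch of engagement-driven ranking: recommend the
# items most similar to what a user has already clicked. This is NOT
# any real platform's algorithm, only an illustration of the feedback
# loop described in the annotations above.
from collections import Counter

def topic_profile(click_history: list[dict]) -> Counter:
    """Count how often each topic tag appears in the user's past clicks."""
    profile = Counter()
    for item in click_history:
        profile.update(item["tags"])
    return profile

def rank(candidates: list[dict], click_history: list[dict]) -> list[dict]:
    """Order candidate items by overlap with the user's existing interests."""
    profile = topic_profile(click_history)

    def score(item: dict) -> int:
        return sum(profile[tag] for tag in item["tags"])

    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    history = [
        {"title": "video A", "tags": ["immigration", "outrage"]},
        {"title": "video B", "tags": ["outrage", "conspiracy"]},
    ]
    candidates = [
        {"title": "calm explainer", "tags": ["history", "immigration"]},
        {"title": "angrier video", "tags": ["outrage", "conspiracy"]},
    ]
    # The more outrage-tagged items the user has clicked, the higher
    # outrage-tagged items are ranked the next time around.
    for item in rank(candidates, history):
        print(item["title"])
```

Even this crude version shows the self-reinforcing property the author describes: every click on one kind of content pushes more of that content to the top of what is shown next.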