TOK Friends: Group items tagged "rigid"

Javier E

The Crowd Pleaser - NYTimes.com - 0 views

  • Obama seems self-sufficient while Romney seems other-directed.
  • I’m borrowing the phrase “other-directed” from David Riesman’s 1950 classic, “The Lonely Crowd.”
  • Riesman argued that different eras nurture different personality types. The agricultural economy nurtured tradition-directed individuals. People lived according to the ancient cycles, customs and beliefs. Children grew up and performed the same roles as their parents.
  • The industrial era favored the inner-directed personality type. The inner-directed person was guided by a set of strong internal convictions, like Victorian morality. The inner-directed person was a hardy pioneer, the stolid engineer or the resilient steelworker — working on physical things. This person was often rigid, but also steadfast.
  • The other-directed personality type emerges in a service or information age economy. In this sort of economy, most workers are not working with physical things; they are manipulating people. The other-directed person becomes adept at pleasing others, at selling him or herself. The other-directed person is attuned to what other people want him to be. The other-directed person is a pliable member of a team and yearns for acceptance. He or she is less notable for having a rigid character than for having a smooth personality.
Javier E

Book Club: A Guide To Living « The Dish - 0 views

  • He proves nothing that he doesn’t simultaneously subvert a little; he makes no over-arching argument about the way humans must live; he has no logician’s architecture or religious doctrine. He slips past all those familiar means of telling other people what’s good for them, and simply explains what has worked for him and others and leaves the reader empowered to forge her own future
  • You can see its eccentric power by considering the alternative ways of doing what Montaigne was doing. Think of contemporary self-help books – and all the fake certainty and rigid formulae they contain. Or think of a hideous idea like “the purpose-driven life” in which everything must be forced into the box of divine guidance in order to really live at all. Think of the stringency of Christian disciplines – say, the spiritual exercises of Ignatius of Loyola – and marvel at how Montaigne offers an entirely different and less compelling way to live. Think of the rigidity of Muslim practice and notice how much lee-way Montaigne gives to sin
  • This is a non-philosophical philosophy. It is a theory of practical life as told through one man’s random and yet not-so-random reflections on his time on earth. And it is shot through with doubt. Even the maxims that Montaigne embraces for living are edged with those critical elements of Montaigne’s thought that say “as far as I know”
  • Is this enough? Or is it rather a capitulation to relativism, a manifesto for political quietism, a worldview that treats injustice as something to be abhorred but not constantly fought against? This might be seen as the core progressive objection to the way of Montaigne. Or is his sensibility in an age of religious terror and violence and fanaticism the only ultimate solution we have?
  • here’s what we do know. We are fallible beings; we have nothing but provisional knowledge; and we will die. And this is enough. This does not mean we should give up inquiring or seeking to understand. Skepticism is not nihilism. It doesn’t posit that there is no truth; it merely notes that if truth exists, it is inherently beyond our ultimate grasp. And accepting those limits is the first step toward sanity, toward getting on with life. This is what I mean by conservatism.
  • you can find in philosophy any number of clues about how to live; you can even construct them into an ideology that explains all of human life and society – like Marxism or free market fundamentalism or a Nietzschean will to power. But as each totalist system broke down upon my further inspection, I found myself returning to Montaigne and the tradition of skepticism he represents
  • If I were to single out one theme of Montaigne’s work that has stuck with me, it would be this staring of death in the face, early and often, and never flinching. It is what our culture refuses to do much of the time, thereby disempowering us in the face of our human challenges.
silveiragu

BBC - Future - The countries that don't exist - 2 views

  • In the deep future, every territory we know could eventually become a country that doesn’t exist.
    • silveiragu
       
      Contrary to the human expectation that situations remain constant. 
  • There really is a secret world of hidden independent nations
  • Middleton, however, is here to talk about countries missing from the vast majority of books and maps for sale here. He calls them the “countries that don’t exist”
    • silveiragu
       
      Reminds us of our strange relationship with nationalism: we forget how artificial countries' boundaries are. 
  • The problem, he says, is that we don’t have a watertight definition of what a country is. “Which, as a geographer, is kind of shocking.”
  • The globe, it turns out, is full of small (and not so small) regions that have all the trappings of a real country and are ignored on most world maps.
  • Middleton, a geographer at the University of Oxford, has now charted these hidden lands in his new book, An Atlas of Countries that Don’t Exist
  • Middleton’s quest began, appropriately enough, with Narnia
    • silveiragu
       
      Interesting connection to imagination as a way of knowing.
  • a defined territory, a permanent population, a government, and “the capacity to enter into relations with other states”.
  • In Australia, meanwhile, the Republic of Murrawarri was founded in 2013, after the indigenous tribe wrote a letter to Queen Elizabeth II asking her to prove her legitimacy to govern their land.
  • Yet many countries that meet these criteria aren’t members of the United Nations (commonly accepted as the final seal of a country’s statehood).
  • many of them are instead members of the “Unrepresented United Nations” – an alternative body to champion their rights.
  • A handful of the names will be familiar to anyone who has read a newspaper: territories such as Taiwan, Tibet, Greenland, and Northern Cyprus.
  • The others are less famous, but they are by no means less serious
    • silveiragu
       
      By what criterion, "serious"?
  • One of the most troubling histories, he says, concerns the Republic of Lakotah (with a population of 100,000). Bang in the centre of the United States of America (just east of the Rocky Mountains), the republic is an attempt to reclaim the sacred Black Hills for the Lakota Sioux tribe.
  • Their plight began in the 18th Century, and by 1868 they had finally signed a deal with the US government that promised the right to live on the Black Hills. Unfortunately, they hadn’t accounted for a gold rush
  • Similar battles are being fought across every continent.
  • In fact, you have almost certainly, unknowingly, visited one.
  • Christiania, an enclave in the heart of Copenhagen.
  • On 26 September 1971, they declared it independent, with its own “direct democracy”, in which each of the inhabitants (now numbering 850) could vote on any important matter.
    • silveiragu
       
      Interesting reminder that the label "country" does not only have to arise from military or economic struggles, as it is tempting to think in our study of history. Also an interesting reminder that the label of "country", by itself, means nothing. 
  • a blind eye to the activities
    • silveiragu
       
      That is really why any interest is demonstrated towards this topic: not that some country named Christiania exists in the heart of Denmark, but that its people can legitimately call themselves a nation. We have grown up, and our parents have grown up, with a rigid definition of nationalism, and the strange notion that the lines in an atlas were always there. One interpretation of the Danish government's response to Christiania is simply that it does not know what to think. Although probably not geopolitically significant, such enclave states represent a challenge to our perception of countries, one which fascinates Middleton's readers because it disconcerts them. 
  • perhaps we need to rethink the concept of the nation-state altogether? He points to Antarctica, a continent shared peacefully among the international community
    • silveiragu
       
      A sign of progress, perhaps, from the industrialism-spurred cycle of dividing land, industrializing, and repeating, even if the chief reason is the region's climate. 
  • The last pages of Middleton’s Atlas contain two radical examples that question everything we think we mean by the word ‘country’.
    • silveiragu
       
      These "nonexistent countries", and our collective disregard for them, are reminiscent of the 17th and 18th centuries: then, the notion of identifying by national lines was almost as strange and artificial as these countries' borders seem to us today. 
  • “They all raise the possibility that countries as we know them are not the only legitimate basis for ordering the planet,
Javier E

We need a major redesign of life - The Washington Post - 0 views

  • Thirty years were added to average life expectancy in the 20th century, and rather than imagine the scores of ways we could use these years to improve quality of life, we tacked them all on at the end. Only old age got longer.
  • As a result, most people are anxious about the prospect of living for a century. Asked about aspirations for living to 100, typical responses are “I hope I don’t outlive my money” or “I hope I don’t get dementia.”
  • Each generation is born into a world prepared by its ancestors with knowledge, infrastructure and social norms. The human capacity to benefit from this inherited culture afforded us such extraordinary advantages that premature death was dramatically reduced in a matter of decades. Yet as longevity surged, culture didn’t keep up
  • Long lives are not the problem. The problem is living in cultures designed for lives half as long as the ones we have.
  • Retirements that span four decades are unattainable for most individuals and governments; education that ends in the early 20s is ill-suited for longer working lives; and social norms that dictate intergenerational responsibilities between parents and young children fail to address families that include four or five living generations.
  • Last year, the Stanford Center on Longevity launched an initiative called “The New Map of Life.”
  • it would be a mistake to replace the old rigid model of life — education first, then family and work, and finally retirement — with a new model just as rigid
  • there should be many different routes, interweaving leisure, work, education and family throughout life, taking people from birth to death with places to stop, rest, change courses and repeat steps along the way.
  • To thrive in an age of rapid knowledge transfer, children not only need reading, math and computer literacy, but they also need to learn to think creatively and not hold on to “facts” too tightly. They’ll need to find joy in unlearning and relearning.
  • Teens could take breaks from high school and take internships in workplaces that intrigue them. Education wouldn’t end in youth but rather be ever-present and take many forms outside of classrooms, from micro-degrees to traveling the world.
  • There is good reason to think we will work longer, but we can improve work quality with shorter workweeks, flexible scheduling and frequent “retirements.”
  • Rather than saving ever-larger pots of money for the end of life, we could pool risks in new ways
  • Generations may share wealth earlier than traditional bequests; we can start savings accounts at birth and allow young adults to work earlier so that compound interest can work in their favor. (A quick arithmetic check of this compounding effect follows this list.)
  • Maintaining physical fitness from the beginning to end of life will be paramount. Getting children outside, encouraging sports, reducing the time we sit, and spending more time walking and moving will greatly improve individual lives.
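The compounding claim in the savings bullet above is easy to sanity-check. A minimal sketch in Python; the 5% return, the one-time $5,000 deposit, and the ages are illustrative assumptions, not figures from the article:

```python
# Illustrative numbers only: compare a deposit made at birth with the same
# deposit made at age 25, both left to compound until age 65.

def future_value(principal: float, annual_rate: float, years: int) -> float:
    """Lump sum compounded once per year."""
    return principal * (1 + annual_rate) ** years

deposit = 5_000
rate = 0.05  # assumed 5% annual return

at_birth = future_value(deposit, rate, 65)      # deposited at birth
at_25 = future_value(deposit, rate, 65 - 25)    # deposited at age 25

print(f"Deposited at birth: ${at_birth:,.0f}")  # roughly $119,000
print(f"Deposited at 25:    ${at_25:,.0f}")     # roughly $35,000
```

The 25 extra years of compounding more than triple the outcome, which is the case for starting accounts at birth.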
Javier E

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel and King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look at the units of computation. Think about a Turing machine, say, which is the simplest form of computation, you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working and you see that from Marr's highest level.
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... Interviewer: ...in engineering? Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is. (A toy version of the Bayesian-updating step appears at the end of this list.)
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, it's got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just like as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kind of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. Interviewer: Well, they constrain the biology, sure. Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
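Chomsky's weather example gestures at real machinery. Here is a minimal beta-binomial sketch of the "correct it by Bayesian methods, you get better priors" step; the prior and the observed counts are invented for illustration and come from nowhere in the interview:

```python
# Beta-binomial updating of P(rain tomorrow), with made-up numbers.
# The posterior after each batch of observations becomes the improved
# prior for the next round: an approximation that gets better and better
# while containing no model of the underlying meteorology.

def update_rain_belief(rainy_pseudo: float, dry_pseudo: float,
                       rainy_days: int, dry_days: int) -> tuple[float, float]:
    """Conjugate Beta update: prior pseudo-counts plus observed counts."""
    return rainy_pseudo + rainy_days, dry_pseudo + dry_days

# Weak prior: about a 30% chance of rain (3 rainy vs. 7 dry pseudo-days).
a, b = 3.0, 7.0

# Observe a month with 12 rainy and 18 dry days.
a, b = update_rain_belief(a, b, rainy_days=12, dry_days=18)

print(f"P(rain tomorrow) = {a / (a + b):.2f}")  # 0.38
```

This is exactly the kind of "success" the passage describes: a well-calibrated guess, with no understanding of fronts, pressure, or humidity anywhere in it.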
Javier E

Meeting 'the Other' Face to Face - The New York Times - 0 views

  • Sitting in a conference room at a hotel near the Massachusetts Institute of Technology here, I slip on large headphones and an Oculus Rift virtual reality headset and wriggle into the straps of a backpack, weighed down with a computer and a battery.
  • when I stand, I quickly find myself in a featureless all-white room, a kind of Platonic vestibule. On the walls at either end are striking poster-size black-and-white portraits taken by the noted Belgian-Tunisian photographer Karim Ben Khelifa, one showing a young Israeli soldier and another a Palestinian fighter about the same age, whose face is almost completely hidden by a black hood.
  • Then the portraits disappear, replaced by doors, which open. In walk the two combatants — Abu Khaled, a fighter for the Popular Front for the Liberation of Palestine, and Gilad Peled, an Israeli soldier — seeming, except for a little pixelation and rigid body movement, like flesh-and-blood people who are actually in the room with me.
  • What he saw there was a culture of warfare that often perpetuated itself through misunderstanding and misinformation, with no mechanism for those of opposing sects or political forces to gain a sense of the enemy as a fellow human being.
  • “I began to think, ‘I’m meeting the same people over and over again,’” he said. “I’m seeing people I knew as kids, and now they’re grown-up fighters, in power, fighting the same fight. And you start to think about your work in terms of: ‘Am I helping to change anything? Am I having any impact?’ ”
  • “I thought of myself as a war illustrator. I started calling myself that.”
  • as a visiting artist at the university’s Center for Art, Science and Technology, he transformed what he initially conceived of as an unconventional photo and testimonial project involving fighters into a far more unconventional way of hearing and seeing his subjects, hoping to be able to engender a form of empathy beyond the reach of traditional documentary film
  • Then he and a small crew captured three-dimensional scans of the men and photographed them from multiple angles
  • He interviewed Mr. Khaled in Gaza and Mr. Peled in Tel Aviv, asking them the same six questions — basic ones like “Who’s your enemy and why?”; “What is peace for you?”; “Have you ever killed one of your enemies?”; “Where do you see yourself in 20 years?”
  • he began to build avatars of his interviewees and ways for them to move and respond inside a virtual world so realistic it makes even a 3-D movie seem like an artifact from the distant past. Mr. Harrell describes it as “long-form journalism in a totally new form.”
  • “You have something here you don’t have in any other form of journalism: body language.”
  • indeed, inside the world they have made, the power comes from the feeling of listening to the interviewees speak (you hear Mr. Ben Khelifa’s disembodied voice asking the questions, and the men’s voices answer, overlaid by the voice of an interpreter) as your body viscerally senses a person standing a few feet away from you, his eyes following yours as he talks, his chest rising and falling as he breathes.
  • Sofia Ayala, an M.I.T. sophomore, tested the project after I did and emerged — as I did — with a mesmerized flush on her face, a feeling of meeting someone not really there. “It makes it feel so much more personal than just reading about these things online,” she said. “When someone’s right there talking to you, you want to listen.”
  • “In many places I’ve been, you’re given your enemy when you’re born,” he said. “You grow up with this ‘other’ always out there. The best we can hope is that the ‘other’ will now be able to come into the same room with you for a while, where you can listen to him, and see him face to face.”
Emily Freilich

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model, because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • How do you make a search engine that understands if you don’t know how you understand?
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington, where he directs the Fluid Analogies Research Group, affectionately known as FARG.
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • A project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result. (A toy version of this training loop appears at the end of this list.)
  • Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
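The Candide recipe described above (feed in sentence pairs, calibrate, repeat) can be shown at toy scale. A minimal sketch of word-translation-probability estimation in the spirit of that approach; this is the textbook IBM Model 1 trained by expectation-maximization, with three invented sentence pairs standing in for Candide's 2.2 million, and everything else about the real system elided:

```python
# Toy statistical machine translation: learn word-translation probabilities
# t(f|e) from aligned sentence pairs alone, with no grammar or dictionary.
from collections import defaultdict

pairs = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the flower", "la fleur"),
]
corpus = [(e.split(), f.split()) for e, f in pairs]
english_vocab = {w for e_sent, _ in corpus for w in e_sent}

# Start from uniform translation probabilities.
t = defaultdict(lambda: 1.0 / len(english_vocab))

for _ in range(20):  # EM iterations
    count = defaultdict(float)  # expected pair counts c(f, e)
    total = defaultdict(float)  # expected word counts c(e)
    for e_sent, f_sent in corpus:
        for f in f_sent:
            norm = sum(t[(f, e)] for e in e_sent)
            for e in e_sent:
                frac = t[(f, e)] / norm  # fractional alignment weight
                count[(f, e)] += frac
                total[e] += frac
    for (f, e), c in count.items():  # re-estimate t(f|e)
        t[(f, e)] = c / total[e]

print(round(t[("maison", "house")], 2))  # approaches 1.0
print(round(t[("bleue", "blue")], 2))    # learned without any grammar
```

Nothing in the loop knows any French or English; with enough pairs, co-occurrence statistics alone pin down the translations, which is both the power of the approach and, in Hofstadter's terms, its poverty.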
kenjiendo

Impact Factor and the Future of Medical Journals - Haider Javed Warraich - The Atlantic - 0 views

    • kenjiendo
       
      An article highlighting recent criticism for the accuracy of published Medical Journals, origins of the issue, and possible solutions for the future. 
  • Impact Factor and the Future of Medical Journals: Some research publications are getting away from flawed measures of influence that make it easy to game the system.
  • This year's Nobel Prize winner in physiology, Randy Schekman, announced his decision to boycott the three major “luxury” journals: Science, Nature, and Cell.
  • medical journals are very rigid
  • impact factor, defined as the number of citations divided by the number of papers published in the journal: a measure meant to convey the influence of journals and the research they carry. (A worked example of this calculation appears at the end of this list.)
  • Journals employ several strategies to artificially raise the impact factor
  • caught trying to induce authors to increase the number of citations
  • cite each other’s articles
  • citations are barely a reflection of the quality of the research and that the impact factor is easily manipulated
  • shing’s growth is actually one of its g
  • overwhelmed with the avalanche of information
  • current system of peer-review, which originated in the 18th century, is now stressed
    • kenjiendo
       
      An example from our reading from U6-9
  • Asked what the future of the medical journal was, he summed it up in just one word: “Digital.”
  • more innovative approaches
  • PLOS One, which provides individual article metrics to anyone who accesses the article.
  • Instead of letting the reputation of the journal decide the impact of its papers, PLOS One provides information about the influence of the article on a more granular level.
  • future of medical publishing is a democratic one
  • Smart software will decide based on largely open access journals which papers will be of most interest to a particular reader.
  • Biology Direct, a journal that provides open peer review that is available for readers to read along with the article, with or without changes suggested by the reviewers.
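The definition quoted above is simple enough to compute directly. A minimal sketch; the journal numbers are invented, and the two-year citation window is the conventional impact-factor definition, an assumption not spelled out in the article:

```python
# Conventional impact factor for year Y: citations received in Y to items
# published in Y-1 and Y-2, divided by the number of papers published in
# those two years. All numbers below are hypothetical.

def impact_factor(citations_to_prior_two_years: int,
                  papers_in_prior_two_years: int) -> float:
    return citations_to_prior_two_years / papers_in_prior_two_years

# A made-up journal: 200 papers published over 2012-2013, which drew
# 700 citations during 2014.
print(impact_factor(700, 200))  # 3.5
```

Because both numbers are under a journal's influence (soliciting citations, publishing fewer but more citable papers), the metric is easy to game, which is the article's complaint.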
Javier E

New Thinking and Old Books Revisited - NYTimes.com - 0 views

  • Mark Thoma’s classic crack — “I’ve learned that new economic thinking means reading old books” — has a serious point to it. We’ve had a couple of centuries of economic thought at this point, and quite a few smart people doing the thinking. It’s possible to come up with truly new concepts and approaches, but it takes a lot more than good intentions and casual observation to get there.
  • There is definitely a faction within economics that considers it taboo to introduce anything into its analysis that isn’t grounded in rational behavior and market equilibrium
  • what I do, and what everyone I’ve just named plus many others does, is a more modest, more eclectic form of analysis. You use maximization and equilibrium where it seems reasonably consistent with reality, because of its clarifying power, but you introduce ad hoc deviations where experience seems to demand them — downward rigidity of wages, balance-sheet constraints, bubbles (which are hard to predict, but you can say a lot about their consequences).
  • You may say that what we need is reconstruction from the ground up — an economics with no vestige of equilibrium analysis. Well, show me some results. As it happens, the hybrid, eclectic approach I’ve just described has done pretty well in this crisis, so you had better show me some really superior results before it gets thrown out the window.
  • if you think you’ve found a fundamental logical flaw in one of our workhorse economic models, the odds are very strong that you’ve just made a mistake.
  • it’s quite clear that the teaching of macroeconomics has gone seriously astray. As Saraceno says, the simple models that have proved so useful since 2008 are by and large taught only at the undergrad level — they’re treated as too simple, too ad hoc, whatever, to make it into the grad courses even at places that aren’t very ideological.
  • to temper your modeling with a sense of realism you need to know something about reality — and not just the statistical properties of U.S. time series since 1947. Economic history — global economic history — should be a core part of the curriculum. Nobody should be making pronouncements on macro without knowing a fair bit about the collapse of the gold standard in the 1930s, what actually happened in the stagflation of the 1970s, the Asian financial crisis of the 90s, and, looking forward, the euro crisis.
Javier E

Biker Gangs, Tamir Rice, And The Rise Of White Fragility - 0 views

  • The most dangerous uprising that's threatening America's stability isn't black protests in places like Ferguson or Baltimore. It's taking place among an aging white majority that is losing its bearing on reality and destroying the gears of government, media and public welfare. At its center is an inexplicable, illogical and dangerous fear that some sociologists are now defining as white fragility.
  • In her 2011 academic pedagogical analysis titled “White Fragility,” Robin DiAngelo goes into a detailed explanation of how white people in North America live in insulated social and media spaces that protect them from any race-based stress. This privileged fragility leaves them unable to tolerate any schism or challenge to a universally accepted belief system. Any shift away from that (like a biracial African-American president) triggers a deep and sustaining panic. Racial segregation, disproportionate representation in the media, and many other factors serve as the columns that support white fragility
  • misunderstanding was caused by misidentification of what white privilege and power means. Privilege doesn’t mean automatic wealth and health. What “white privilege” means is that society is rooting for one particular segment of the population to succeed over all others, and has installed a disproportionately high amount of institutional and psychological helpers every step of the way.
  • “Part of white fragility is to assume that when we talk about racism, we are calling someone out as being individually a racist,” he said. “So if you say we're going to talk about racism, white people think you're going to call them a name. But for most people of color it's a system. And we're talking about dealing with a structure so the real problem is the system.”
  • When separate groups of people are using the same word with different implied meanings then problems will persist.
  • When it comes to racism and increased segregation, both Wise and DiAngelo noted that there seems to be this rigid unwillingness to address any inequality, because it would upset the very people who are both benefiting from the injustice and refusing to acknowledge its existence.
  • The fear is that if someone seeks to define and fix racism, many white people feel like they’re being directly attacked. So instead of waiting for the attack, white fragility promotes protection by putting punitive restrictions on “the others.”
  • The Obama era has been an interesting petri dish of white fragility. On the heels of a moderate economic recovery, we’ve seen sweeping new state laws aimed at social issues: voting rights restrictions, defunding of Planned Parenthood, anti-gay legislation, Stand Your Ground bills, and restrictive union laws to weaken their bargaining power. These laws have resulted in a rollback of rights for minorities, women, the LGBT movement, and the working class.
  • The strangest thing about white fragility politics is that the detrimental policy results are spread out across race and class. Yet, the political results for the conservative movement priming the pump of white fragility and rage is election victories. And why should they change when they can get large sections of an aging white population to consistently vote for policies proven to statistically hurt their economic chances, personal health, their children’s education, and their very safety?
  • These are not rational decisions. These are fear-based politics that create avoidable disasters in which all suffer. This new wave of segregation fear is surging across the country. In response to the continued white fragility panic of 2008, conservative political movements are set to capitalize on the cycles of manufactured hysteria. “We are watching the repeal of the 20th century,” Wise said.
  • When I asked Wise and DiAngelo to give me something hopeful for the future, they both gave me a bleak picture. When I suggested that more facts and evidence could sway people, they disagreed. “People who are deeply committed to a world view don’t change their opinions when confronted with new facts,” Wise said. “Oddly enough, new facts cause them to dig in more deeply.”
Javier E

A Meditation on the Art of Not Trying - NYTimes.com - 0 views

  • It’s the default prescription for any tense situation: a blind date, a speech, a job interview, the first dinner with the potential in-laws. Relax. Act natural. Just be yourself. But when you’re nervous, how can you be yourself?
  • Edward Slingerland. He has developed, quite deliberately, a theory of spontaneity based on millenniums of Asian philosophy and decades of research by psychologists and neuroscientists.
  • He calls it the paradox of wu wei, the Chinese term for “effortless action.”
  • Wu wei is integral to romance, religion, politics and commerce. It’s why some leaders have charisma and why business executives insist on a drunken dinner before sealing a deal.
  • the quest for wu wei has been going on ever since humans began living in groups larger than hunter-gathering clans. Unable to rely on the bonds of kinship, the first urban settlements survived by developing shared values, typically through religion, that enabled people to trust one another’s virtue and to cooperate for the common good.
  • But there was always the danger that someone was faking it and would make a perfectly rational decision to put his own interest first if he had a chance to shirk his duty.
  • To be trusted, it wasn’t enough just to be a sensible, law-abiding citizen, and it wasn’t even enough to dutifully strive to be virtuous. You had to demonstrate that your virtue was so intrinsic that it came to you effortlessly.
  • the discovery in 1993 of bamboo strips in a tomb in the village of Guodian in central China. The texts on the bamboo, composed more than three centuries before Christ, emphasize that following rules and fulfilling obligations are not enough to maintain social order.
  • These texts tell aspiring politicians that they must have an instinctive sense of their duties to their superiors: “If you try to be filial, this is not true filiality; if you try to be obedient, this is not true obedience. You cannot try, but you also cannot not try.”
  • is that authentic wu wei? Not according to the rival school of Taoists that arose around the same time as Confucianism, in the fifth century B.C. It was guided by the Tao Te Ching, “The Classic of the Way and Virtue,” which took a direct shot at Confucius: “The worst kind of Virtue never stops striving for Virtue, and so never achieves Virtue.”
  • Through willpower and the rigorous adherence to rules, traditions and rituals, the Confucian “gentleman” was supposed to learn proper behavior so thoroughly that it would eventually become second nature to him.
  • Taoists did not strive. Instead of following the rigid training and rituals required by Confucius, they sought to liberate the natural virtue within. They went with the flow. They disdained traditional music in favor of a funkier new style with a beat. They emphasized personal meditation instead of formal scholarship.
  • Variations of this debate would take place among Zen Buddhist, Hindu and Christian philosophers, and continue today among psychologists and neuroscientists arguing how much of morality and behavior is guided by rational choices or by unconscious feelings.
  • “Psychological science suggests that the ancient Chinese philosophers were genuinely on to something,” says Jonathan Schooler, a psychologist at the University of California, Santa Barbara. “Particularly when one has developed proficiency in an area, it is often better to simply go with the flow. Paralysis through analysis and overthinking are very real pitfalls that the art of wu wei was designed to avoid.”
  • Before signing a big deal, businesspeople often insist on getting to know potential partners at a boozy meal because alcohol makes it difficult to fake feelings.
  • Some people, like politicians and salespeople, can get pretty good at faking spontaneity, but we’re constantly looking for ways to expose them.
  • However wu wei is attained, there’s no debate about the charismatic effect it creates. It conveys an authenticity that makes you attractive, whether you’re addressing a crowd or talking to one person.
  • what’s the best strategy for wu wei — trying or not trying? Dr. Slingerland recommends a combination. Conscious effort is necessary to learn a skill, and the Confucian emphasis on following rituals is in accord with psychological research showing we have a limited amount of willpower. Training yourself to follow rules automatically can be liberating, because it conserves cognitive energy for other tasks.
  • He likes the compromise approach of Mencius, a Chinese philosopher in the fourth century B.C. who combined the Confucian and Taoist approaches: Try, but not too hard.
  • “But in many domains actual success requires the ability to transcend our training and relax completely into what we are doing, or simply forget ourselves as agents.”
  • The sprouts were Mencius’ conception of wu wei: Something natural that requires gentle cultivation. You plant the seeds and water the sprouts, but at some point you need to let nature take its course. Just let the sprouts be themselves.
Javier E

History News Network | How the NCSS Sold Out Social Studies and History - 0 views

  • As a historian, I was influenced by E. H. Carr’s thinking about the past and present as part of a continuum that stretches into the future. In What is History? (1961), Carr argued that concern with the future is what really motivates the study of the past. As a teacher and political activist, I also strongly believe that schools should promote active citizenship as essential for maintaining a democratic society.
  • in an effort to survive, the NCSS has largely abandoned its commitment to these ideas, twisting itself into a pretzel to adapt to national Common Core standards and to satisfy influential conservative organizations that it is not radical, or even liberal. I suspect, but cannot document, that the organization’s membership has precipitously declined during the past two decades and it has increasingly depended on financial support for its conferences and publications from deep-pocketed traditional and rightwing groups who advertise and have display booths.
  • No Child Left Behind (NCLB). Since the introduction of NCLB, there has been a steady reduction in the amount of time spent in the teaching of social studies, with the most profound decline noticed in the elementary grades.”
  • In an effort to counter the Common Core push for decontextualized skill-based instruction and assessment that has further marginalized social studies education, the NCSS is promoting what it calls “College, Career, and Civic Life (C3) Framework,”
  • through its choice of partners, its rigid adherence to Common Core lesson guidelines, and the sample material it is promoting, the NCSS has virtually abandoned not just meaningful social studies education, but education for democracy and citizenship as well.
  • My biggest problem with the C3 Framework as presented in this new document on instruction is its attempt to adopt a fundamentally flawed Common Core approach to social studies and history based on the close reading of text without exploring historical context
  • how Common Core as it is being implemented will mean the end of history.
  • In the C3 Framework inquiry approach, students start in Dimension 1 by “developing questions and planning inquiries,” however the inquiry is really already “planned” because material they use is pre-selected. It is also not clear what their questions will be based on since they do not necessarily have any background on the subject. In Dimension 3 students evaluate sources using evidence, but again, devoid of historical context. Dimension 4 is supposed to be C3’s chief addition to Common Core. In Dimension 4 students are supposed to plan activities and become involved in civic life, although of course their options have again already been pre-prescribed.
  • In Dimension 2, as they read the text, which sixth graders and many eighth graders will find difficult, students discuss “How was citizenship revolutionary in 1776?” The question requires them to assume that colonists had already formulated a concept of citizenship, which I do not believe they had, a concept of nation, which they definitely did not, and an understanding that somehow what they were doing was “revolutionary,” which was still being debated.
  • Some of the organizations involved in writing the C3 Frameworks have positions so politically skewed to the right that NCSS should be embarrassed about including them in the project. In this category I include Gilder Lehrman, The Bill of Rights Institute, and The Center for Economic Education and Entrepreneurship (CEEE).
  • Conspicuously missing from the group of contributors is the Zinn Education Project which would have provided a radically different point of view.
  • What we have in the C3 Framework is standard teaching at best but a lot of poor teaching and propaganda as well.
  • Instead of challenging Common Core, the NCSS begs to be included. Instead of presenting multiple perspectives, it sells advertising in the form of lessons to its corporate and foundation sponsors. But worst in their own terms, in a time of mass protest against police brutality by high school and college students across the United States, active citizenship in a democratic society is stripped of meaning and becomes little more than idle discussion and telling students to vote when they are eighteen.
Javier E

The Mental Virtues - NYTimes.com - 0 views

  • Even if you are alone in your office, you are thinking. Thinking well under a barrage of information may be a different sort of moral challenge than fighting well under a hail of bullets, but it’s a character challenge nonetheless.
  • some of the cerebral virtues. We can all grade ourselves on how good we are at each of them.
  • love of learning. Some people are just more ardently curious than others, either by cultivation or by nature.
  • courage. The obvious form of intellectual courage is the willingness to hold unpopular views. But the subtler form is knowing how much risk to take in jumping to conclusions.
  • Intellectual courage is self-regulation, Roberts and Wood argue, knowing when to be daring and when to be cautious. The philosopher Thomas Kuhn pointed out that scientists often simply ignore facts that don’t fit with their existing paradigms, but an intellectually courageous person is willing to look at things that are surprisingly hard to look at.
  • The median point between flaccidity and rigidity is the virtue of firmness. The firm believer can build a steady worldview on solid timbers but still delight in new information. She can gracefully adjust the strength of her conviction to the strength of the evidence. Firmness is a quality of mental agility.
  • humility, which is not letting your own desire for status get in the way of accuracy. The humble person fights against vanity and self-importance.
  • wisdom isn’t a body of information. It’s the moral quality of knowing how to handle your own limitations.
  • autonomy
  • Autonomy is the median of knowing when to bow to authority and when not to, when to follow a role model and when not to, when to adhere to tradition and when not to.
  • generosity. This virtue starts with the willingness to share knowledge and give others credit. But it also means hearing others as they would like to be heard, looking for what each person has to teach and not looking to triumphantly pounce upon their errors.
  • thinking well means pushing against the grain of our nature — against vanity, against laziness, against the desire for certainty, against the desire to avoid painful truths. Good thinking isn’t just adopting the right technique. It’s a moral enterprise and requires good character, the ability to go against our lesser impulses for the sake of our higher ones.
  • The humble researcher doesn’t become arrogant toward his subject, assuming he has mastered it. Such a person is open to learning from anyone at any stage in life.
  • Warren Buffett made a similar point in his own sphere, “Investing is not a game where the guy with the 160 I.Q. beats the guy with the 130 I.Q. Once you have ordinary intelligence, what you need is the temperament to control the urges that get other people into trouble.”
  • Good piece. I only wish David had written more about all the forces that work _against_ the virtues he describes. The innumerable examples of corporate suppression/spin of "inconvenient" truths (e.g., GM, Toyota, et al.), the virtual acceptance that lying is a legitimate tactic in political campaigns, and our preoccupation with celebrity, appearances, and "looking good" in every imaginable transaction all make the quiet virtues that DB describes even more heroic than he suggests.
Javier E

The Trick to Being More Virtuous - NYTimes.com - 1 views

  • Psychologists study a phenomenon called “moral elevation,” an emotional state that leads us to act virtuously when exposed to the virtue of others. In experiments, participants who are brought face to face with others’ gratitude or giving behavior are more likely to display those virtues themselves.
  • We can be the passive beneficiaries of moral elevation. But we can actively pursue it as well by rejecting bad influences and seeking good ones. We can even create the circumstances for moral elevation ourselves.
  • If we want to grow in virtue, and experience a healthier, more productive political environment, each of us must demand more virtue.
  • ...4 more annotations...
  • We get more of what we signal we want through our dollars, clicks and votes. If our politics are too often poisonous, it is because, as a society, we are demanding too much poison.
  • We should ask ourselves: What will my next click say about my desires? Will the next article about politics I read elevate me? Or will it be a pathogen that provides momentary satisfaction from an eloquent insult to my enemies, but ultimately fuels personal bitterness and increases the climate of acrimony in America?
  • what do we really demand of the politicians we support? Humility, optimism and flexibility? Or do we excuse our own side for its ideological rigidity, preening self-regard and blame-shifting?
  • The next two years are a challenge to our political leaders, yes — but also to us, to demand a climate of moral elevation as opposed to destruction of the other side.
Javier E

The Joy of Psyching Myself Out­ - The New York Times - 0 views

  • that neat separation is not just unwarranted; it’s destructive
  • Although it’s often presented as a dichotomy (the apparent subjectivity of the writer versus the seeming objectivity of the psychologist), it need not be.
  • IS it possible to think scientifically and creatively at once? Can you be both a psychologist and a writer?
  • ...10 more annotations...
  • “A writer must be as objective as a chemist,” Anton Chekhov wrote in 1887. “He must abandon the subjective line; he must know that dung heaps play a very reasonable part in a landscape.”
  • At the turn of the century, psychology was a field quite unlike what it is now. The theoretical musings of William James were the norm (a wry commenter once noted that William James was the writer, and his brother Henry, the psychologist)
  • Freud was a breed of psychologist that hardly exists anymore: someone who saw the world as both writer and psychologist, and for whom there was no conflict between the two. That boundary melding allowed him to posit the existence of cognitive mechanisms that wouldn’t be empirically proved for decades,
  • Freud got it brilliantly right and brilliantly wrong. The rightness is as good a justification as any of the benefits, the necessity even, of knowing how to look through the eyes of a writer. The wrongness is part of the reason that the distinction between writing and experimental psychology has grown far more rigid than it was a century ago.
  • the signs people associate with liars often have little empirical evidence to support them. Therein lies the psychologist’s distinct role and her necessity. As a writer, you look in order to describe, but you remain free to use that description however you see fit. As a psychologist, you look to describe, yes, but also to verify.
  • Without verification, we can’t always trust what we see — or rather, what we think we see.
  • The desire for the world to be what it ought to be and not what it is permeates experimental psychology as much as writing, though. There’s experimental bias and the problem known in the field as “demand characteristics” — when researchers end up finding what they want to find by cuing participants to act a certain way.
  • IN 1932, when he was in his 70s, Freud gave a series of lectures on psychoanalysis. In his final talk, “A Philosophy of Life,” he focused on clarifying an important caveat to his research: His followers should not be confused by the seemingly internal, and thus possibly subjective, nature of his work. “There is no other source of knowledge of the universe but the intellectual manipulation of carefully verified observations,” he said.
  • That is what both the psychologist and the writer should strive for: a self-knowledge that allows you to look in order to discover, without agenda, without preconception, without knowing or caring if what you’re seeing is wrong or right in your scheme of the world. It’s harder than it sounds. For one thing, you have to possess the self-knowledge that will allow you to admit when you’re wrong.
  • Even with the best intentions, objectivity can prove a difficult companion. I left psychology behind because I found its structural demands overly hampering. I couldn’t just pursue interesting lines of inquiry; I had to devise a set of experiments, see how feasible they were, both technically and financially, consider how they would reflect on my career. That meant that most new inquiries never happened — in a sense, it meant that objectivity was more an ideal than a reality. Each study was selected for a reason other than intrinsic interest.
Javier E

Geology's Timekeepers Are Feuding - The Atlantic - 0 views

  • In 2000, the Nobel Prize-winning chemist Paul Crutzen won permanent fame for stratigraphy. He proposed that humans had so thoroughly altered the fundamental processes of the planet—through agriculture, climate change, nuclear testing, and other phenomena—that a new geological epoch had commenced: the Anthropocene, the age of humans.
  • Zalasiewicz should know. He is the chair of the Anthropocene working group, which the ICS established in 2009 to investigate whether the new epoch deserved a place in stratigraphic time.
  • In 2015, the group announced that the Anthropocene was a plausible new layer and that it should likely follow the Holocene. But the team has yet to propose a “golden spike” for the epoch: a boundary in the sedimentary rock record where the Anthropocene clearly begins.
  • ...12 more annotations...
  • Officially, the Holocene is still running today. You have lived your entire life in the Holocene, and the Holocene has constituted the geological “present” for as long as there have been geologists. But if we now live in a new epoch, the Anthropocene, then the ICS will have to chop the Holocene somewhere. It will have to choose when the Holocene ended, and it will move some amount of time out of the purview of the Holocene working group and into that of the Anthropocene working group.
  • This is politically difficult. And right now, the Anthropocene working group seems intent on not carving too deep into the Holocene. In a paper published earlier this year in Earth-Science Reviews, the Anthropocene working group’s members strongly imply that they will propose starting the new epoch in the mid-20th century.
  • Some geologists argue that the Anthropocene started even earlier: perhaps 4,000 or 6,000 years ago, as farmers began to remake the land surface. “Most of the world’s forests that were going to be converted to cropland and agriculture were already cleared well before 1950,” says Bill Ruddiman, a geology professor at the University of Virginia and an advocate of this extremely early Anthropocene.
  • “Most of the world’s prairies and steppes that were going to be cleared for crops were already gone, by then. How can you argue the Anthropocene started in 1950 when all of the major things that affect Earth’s surface were already over?” Van der Pluijm agreed that the Anthropocene working group was picking 1950 for “not very good reasons.” “Agriculture was the revolution that allowed society to develop,” he said. “That was really when people started to force the land to work for them. That massive land movement—it’s like a landslide, except it’s a humanslide. And it is not, of course, as dramatic as today’s motion of land, but it starts the clock.”
  • This muddle had to stop. The Holocene comes up constantly in discussions of modern global warming. Geologists and climate scientists did not make their jobs any easier by slicing it in different ways and telling contradictory stories about it.
  • This process started almost 10 years ago. For this reason, Zalasiewicz, the chair of the Anthropocene working group, said he wasn’t blindsided by the new subdivisions at all. In fact, he voted to adopt them as a member of the Quaternary working group. “Whether the Anthropocene works with a unified Holocene or one that’s in three parts makes for very little difference,” he told me. In fact, it had made the Anthropocene group’s work easier. “It has been useful to compare the scale of the two climate events that mark the new boundaries [within the Holocene] with the kind of changes that we’re assessing in the Anthropocene. It has been quite useful to have the compare and contrast,” he said. “Our view is that some of the changes in the Anthropocene are rather bigger.”
  • Zalasiewicz said that he and his colleagues were going as fast as they could. When the working group began its work in 2009, it was “really starting from scratch,” he told me. While other working groups have a large body of stratigraphic research to consider, the Anthropocene working group had nothing. “We had to spend a fair bit of time deciding whether the Anthropocene was geology at all,” he said. Then they had to decide where its signal could show up. Now, they’re looking for evidence that shows it.
  • This cycle of “glacials” and “interglacials” has played out about 50 times over the last several million years. When the Holocene began, it was only another interglacial—albeit the one we live in. Until recently, glaciers were still on schedule to descend in another 30,000 years or so. Yet geologists still call the Holocene an epoch, even though they do not bestow this term on any of the previous 49 interglacials. It gets special treatment because we live in it.
  • Much of this science is now moot. Humanity’s vast emissions of greenhouse gas have now so warmed the climate that they have offset the next glaciation. They may even knock us out of the ongoing cycle of Ice Ages, sending the Earth hurtling back toward a “greenhouse” climate after the more amenable “icehouse” climate during which humans evolved. For this reason, van der Pluijm wants the Anthropocene to supplant the Holocene entirely. Humans made their first great change to the environment at the close of the last glaciation, when they seem to have hunted the world’s largest mammals—the woolly mammoth, the saber-toothed tiger—to extinction. Why not start the Anthropocene then? He would even rename the pre-1800 period “the Holocene Age” as a consolation prize:
  • Zalasiewicz said he would not start the Anthropocene too early in time, as it would be too work-intensive for the field to rename such a vast swath of time. “The early-Anthropocene idea would crosscut against the Holocene as it’s seen by Holocene workers,” he said. If other academics didn’t like this, they could create their own timescales and start the Anthropocene Epoch where they choose. “We have no jurisdiction over the word Anthropocene,” he said.
  • Ruddiman, the University of Virginia professor who first argued for a very early Anthropocene, now makes an even broader case. He’s not sure it makes sense to formally define the Anthropocene at all. In a paper published this week, he objects to designating the Anthropocene as starting in the 1950s—and then he objects to delineating the Anthropocene, or indeed any new geological epoch, by name. “Keep the use of the term informal,” he told me. “Don’t make it rigid. Keep it informal so people can say the early-agricultural Anthropocene, or the industrial-era Anthropocene.”
  • “This is the age of geochemical dating,” he said. Geologists have stopped looking to the ICS to place each rock sample into the rock sequence. Instead, field geologists use laboratory techniques to get a precise year or century of origin for each rock sample. “The community just doesn’t care about these definitions,” he said.
anonymous

Saudi Arabia's crown prince is making a lot of enemies (opinion) - CNN - 0 views

  • Saudi Arabia's Prince Mohammed bin Salman, first in line to inherit the throne from his 81-year-old father, is not a patient man. The 32-year-old is driving a frenetic pace of change in pursuit of three goals: securing his hold on power, transforming Saudi Arabia into a very different country, and pushing back against Iran.
  • In the two years since his father ascended the throne, this favorite son of King Salman bin Abdulaziz has been spectacularly successful at achieving the first item on his agenda. He has become so powerful so fast that observers can hardly believe how brazenly he is dismantling the old sedate system of family consensus, shared privilege and rigid ultraconservatism.
  • He has vowed to improve the status of women, announcing that the ban on women driving will be lifted next year, and limiting the scope of the execrable "guardianship" system, which treats women like children, requiring permission from male guardians for basic activities. He has also restrained the despised religious police. And just last month he called for a return to a "moderate Islam open to the world and all religions," combating extremism and empowering its citizens.
  • ...1 more annotation...
  • With so many enemies, the crown prince needs to produce more than a vision; he needs to show tangible results. The days of a quiet, patient Saudi Arabia are now over.
anonymous

Opinion | I Survived 18 Years in Solitary Confinement - The New York Times - 0 views

  • I Survived 18 Years in Solitary Confinement
  • Mr. Manuel is an author, activist and poet. When he was 14 years old, he was sentenced to life in prison with no parole and spent 18 years in solitary confinement.
  • Imagine living alone in a room the size of a freight elevator for almost two decades.
  • ...33 more annotations...
  • As a 15-year-old, I was condemned to long-term solitary confinement in the Florida prison system, which ultimately lasted for 18 consecutive years
  • From age 15 to 33.
  • For 18 years I didn’t have a window in my room to distract myself from the intensity of my confinement
  • I wasn’t permitted to talk to my fellow prisoners or even to myself. I didn’t have healthy, nutritious food; I was given just enough to not die
  • These circumstances made me think about how I ended up in solitary confinement.
  • United Nations standards on the treatment of prisoners prohibit solitary confinement for more than 15 days, declaring it “cruel, inhuman or degrading.”
  • For this I was arrested and charged as an adult with armed robbery and attempted murder.
  • My court-appointed lawyer advised me to plead guilty, telling me that the maximum sentence would be 15 years. So I did. But my sentence wasn’t 15 years — it was life imprisonment without the possibility of parole.
  • But a year and a half later, at age 15, I was put back into solitary confinement after being written up for a few minor infractions.
  • Florida has different levels of solitary confinement; I spent the majority of that time in one of the most restrictive
  • I was finally released from prison in 2016 thanks to my lawyer, Bryan Stevenson
  • Researchers have long concluded that solitary confinement causes post-traumatic stress disorder and impairs prisoners’ ability to adjust to society long after they leave their cell.
  • In the summer of 1990, shortly after finishing seventh grade, I was directed by a few older kids to commit a robbery. During the botched attempt, I shot a woman. She suffered serious injuries to her jaw and mouth but survived. It was reckless and foolish on my part, the act of a 13-year-old in crisis, and I’m simply grateful no one died.
  • More aggressive change is needed in state prison systems
  • In 2016, the Obama administration banned juvenile solitary confinement in federal prisons, and a handful of states have advanced similar reforms for both children and adults.
  • Yet the practice, even for minors, is still common in the United States, and efforts to end it have been spotty
  • Because solitary confinement is hidden from public view and the broader prison population, egregious abuses are left unchecked
  • I watched a corrections officer spray a blind prisoner in the face with chemicals simply because he was standing by the door of his cell as a female nurse walked by. The prisoner later told me that to justify the spraying, the officer claimed the prisoner masturbated in front of the nurse.
  • I also witnessed the human consequences of the harshness of solitary firsthand: Some people would resort to cutting their stomachs open with a razor and sticking a plastic spork inside their intestines just so they could spend a week in the comfort of a hospital room with a television
  • On occasion, I purposely overdosed on Tylenol so that I could spend a night in the hospital. For even one night, it was worth the pain.
  • Another time, I was told I’d be switching dorms, and I politely asked to remain where I was because a guard in the new area had been overly aggressive with me. In response, four or five officers handcuffed me, picked me up by my feet and shoulders, and marched with me to my new dorm — using my head to ram the four steel doors on the way there.
  • The punishments were wholly disproportionate to the infractions. Before I knew it, months in solitary bled into years, years into almost two decades.
  • As a child, I survived these conditions by conjuring up stories of what I’d do when I was finally released. My mind was the only place I found freedom from my reality
  • the only place I could play basketball with my brother or video games with my friends, and eat my mother’s warm cherry pie on the porch.
  • No child should have to use their imagination this way — to survive.
  • It is difficult to know the exact number of children in solitary confinement today. The Liman Center at Yale Law School estimated that 61,000 Americans (adults and children) were in solitary confinement in the fall of 2017
  • No matter the count, I witnessed too many people lose their minds while isolated. They’d involuntarily cross a line and simply never return to sanity. Perhaps they didn’t want to. Staying in their mind was the better, safer, more humane option.
  • Solitary confinement is cruel and unusual punishment, something prohibited by the Eighth Amendment, yet prisons continue to practice it.
  • When it comes to children, elimination is the only moral option. And if ending solitary confinement for adults isn’t politically viable, public officials should at least limit the length of confinement to 15 days or fewer, in compliance with the U.N. standards
  • As I try to reintegrate into society, small things often awaken painful memories from solitary. Sometimes relationships feel constraining. It’s difficult to maintain the attention span required for a rigid 9-to-5 job. At first, crossing the street and seeing cars and bikes racing toward me felt terrifying.
  • I will face PTSD and challenges big and small for the rest of my life because of what I was subjected to.
  • And some things I never will — most of all, that this country can treat human beings, especially children, as cruelly as I was treated.
  • Sadly, solitary confinement for juveniles is still permissible in many states. But we have the power to change that — to ensure that the harrowing injustice I suffered as a young boy never happens to another child in America.
  • A very eye-opening article and story, told by a victim, about young children facing solitary confinement.
caelengrubb

Your Ability to Can Even: A Defense of Internet Linguistics - The Toast - 0 views

  • When the new grammatical structures and phrases express something that conventional language simply cannot
  • this new grammar-bending, punctuation-erasing, verb-into-noun-turning, keyboard-smashing linguistic convention doesn’t dominate the whole Internet
  • language generated on Tumblr is now becoming Facebook and Twitter language and influencing language everywhere from Buzzfeed to Autostraddle.
  • ...10 more annotations...
  • The linguistic study of the Internet is a very young field, but it does, in fact, exist
  • Conventional wisdom portrays this form of linguistic flexibility and playfulness as the end of intelligent human life. The Internet has been blamed for making children illiterate, making adults stupid and generally tarnishing the state of modern discourse.
  • Not only are these allegations untrue; David Crystal’s research actually points to the opposite.
  • Those who use technology read more on a day-to-day basis than non-tech users and are, therefore, faster and better readers.
  • The backlash confirms the emergence of Internet Language as a fairly serious development, if not a very small and vibrant written dialect
  • Dialects are characterized as deviations from the “standard” version of a given language and are often dismissed, due to their lack of prestige, by standard users of the language
  • The fact is, the type of language that is being created online is affecting day-to-day speech patterns and writing styles of most young adults
  • Dialects develop when people with a distinct cultural and linguistic heritage run up against a rigid and unfamiliar system, usually by immigrating to a new country. It becomes necessary to develop a way to retain old linguistic features while adopting new ones in order to be able to communicate.
  • Men and women on the Internet use many of the same tropes, enthusiasm markers and emphasizers in order to communicate. In the world of blogging and Internet writing, women are the creators of language
  • But the Internet Language phenomenon is just as much sociological as it is sociolinguistic: we are just as shaped by language as it is shaped by us.