
Home/ TOK Friends/ Group items tagged analog


Javier E

Science: A New Map of the Human Brain - WSJ.com - 0 views

  • The popular left/right story has no solid basis in science. The brain doesn't work one part at a time, but rather as a single interactive system, with all parts contributing in concert, as neuroscientists have long known. The left brain/right brain story may be the mother of all urban legends: It sounds good and seems to make sense—but just isn't true.
  • There is a better way to understand the functioning of the brain, based on another, ordinarily overlooked anatomical division—between its top and bottom parts. We call this approach "the theory of cognitive modes." Built on decades of unimpeachable research that has largely remained inside scientific circles, it offers a new way of viewing thought and behavior.
  • Our theory has emerged from the field of neuropsychology, the study of higher cognitive functioning—thoughts, wishes, hopes, desires and all other aspects of mental life. Higher cognitive functioning is seated in the cerebral cortex, the rind-like outer layer of the brain that consists of four lobes.
  • ...19 more annotations...
  • The top brain comprises the entire parietal lobe and the top (and larger) portion of the frontal lobe. The bottom comprises the smaller remainder of the frontal lobe and all of the occipital and temporal lobes.
  • research reveals that the top-brain system uses information about the surrounding environment (in combination with other sorts of information, such as emotional reactions and the need for food or drink) to figure out which goals to try to achieve. It actively formulates plans, generates expectations about what should happen when a plan is executed and then, as the plan is being carried out, compares what is happening with what was expected, adjusting the plan accordingly.
  • The bottom-brain system organizes signals from the senses, simultaneously comparing what is being perceived with all the information previously stored in memory. It then uses the results of such comparisons to classify and interpret the object or event, allowing us to confer meaning on the world.
  • The top- and bottom-brain systems always work together, just as the hemispheres always do. Our brains are not engaged in some sort of constant cerebral tug of war.
  • Although the top and bottom parts of the brain are always used during all of our waking lives, people do not rely on them to an equal degree. To extend the bicycle analogy, not everyone rides a bike the same way. Some may meander, others may race.
  • You can use the top-brain system to develop simple and straightforward plans, as required by a situation—or you have the option to use it to develop detailed and complex plans (which are not imposed by a situation).
  • Our theory predicts that people fit into one of four groups, based on their typical use of the two brain systems. Depending on the degree to which a person uses the top and bottom systems in optional ways, he or she will operate in one of four cognitive modes: Mover, Perceiver, Stimulator and Adaptor.
  • Mover mode results when the top- and bottom-brain systems are both highly utilized in optional ways. Think of Oprah Winfrey.
  • According to the theory, people who habitually rely on Mover mode are most comfortable in positions that allow them to plan, act and see the consequences of their actions. They are well suited to being leaders.
  • Perceiver mode results when the bottom-brain system is highly utilized in optional ways but the top is not. Think of the Dalai Lama or Emily Dickinson
  • People who habitually rely on Perceiver mode try to make sense in depth of what they perceive; they interpret their experiences, place them in context and try to understand the implications.
  • such people—including naturalists, pastors, novelists—typically lead lives away from the limelight. Those who rely on this mode often play a crucial role in a group; they can make sense of events and provide a bigger picture
  • Stimulator mode results when the top-brain system is highly utilized but the bottom is not. According to our theory, people who interact with the world in Stimulator mode often create and execute complex and detailed plans (using the top-brain system) but fail to register consistently and accurately the consequences of acting on those plans.
  • they may not always note when enough is enough. Their actions can be disruptive, and they may not adjust their behavior appropriately.
  • Examples of people who illustrate Stimulator mode would include Tiger Woods
  • Adaptor mode results when neither the top- nor the bottom-brain system is highly utilized in optional ways. People who think in this mode are not caught up in initiating plans, nor are they fully focused on classifying and interpreting what they experience. Instead, they become absorbed by local events and the immediate requirements of the situation.
  • They are responsive and action-oriented and tend to "go with the flow." Others see them as free-spirited and fun to be with.
  • those who typically operate in Adaptor mode can be valuable team members. In business, they often form the backbone of an organization, carrying out essential operations.
  • No one mode is "better" than the others. Each is more or less useful in different circumstances, and each contributes something useful to a team. Our theory leads us to expect that you can work with others most productively when you are aware not just of the strengths and weaknesses of their preferred modes but also of the strengths and weaknesses of your own preferred mode.
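The four modes above form a simple two-by-two classification over how much each brain system is used in optional ways. As an illustrative sketch only (the mapping comes from the article; the function name and boolean encoding are our own assumptions):

```python
def cognitive_mode(top_highly_utilized: bool, bottom_highly_utilized: bool) -> str:
    """Map optional utilization of the top- and bottom-brain systems
    to one of the four cognitive modes described in the theory."""
    if top_highly_utilized and bottom_highly_utilized:
        return "Mover"        # plans and registers consequences
    if bottom_highly_utilized:
        return "Perceiver"    # interprets in depth, initiates fewer plans
    if top_highly_utilized:
        return "Stimulator"   # plans richly, misses consequences
    return "Adaptor"          # responsive, goes with the flow


# e.g. cognitive_mode(True, True) -> "Mover"
```

The point of the sketch is simply that two independent binary dimensions yield exactly the four modes the theory names.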
jlessner

Why Facebook's News Experiment Matters to Readers - NYTimes.com - 0 views

  • Facebook’s new plan to host news publications’ stories directly is not only about page views, advertising revenue or the number of seconds it takes for an article to load. It is about who owns the relationship with readers.
  • It’s why Google, a search engine, started a social network and why Facebook, a social network, started a search engine. It’s why Amazon, a shopping site, made a phone and why Apple, a phone maker, got into shopping.
  • Facebook’s experiment, called instant articles, is small to start — just a few articles from nine media companies, including The New York Times. But it signals a major shift in the relationship between publications and their readers. If you want to read the news, Facebook is saying, come to Facebook, not to NBC News or The Atlantic or The Times — and when you come, don’t leave. (For now, these articles can be viewed on an iPhone running the Facebook app.)
  • ...6 more annotations...
  • The front page of a newspaper and the cover of a magazine lost their dominance long ago.
  • But news reports, like albums before them, have not been created that way. One of the services that editors bring to readers has been to use their news judgment, considering a huge range of factors, when they decide how articles fit together and where they show up. The news judgment of The New York Times is distinct from that of The New York Post, and for generations readers appreciated that distinction.
  • “In digital, every story becomes unbundled from each other, so if you’re not thinking of each story as living on its own, it’s tying yourself back to an analog era,” Mr. Kim said.
  • Facebook executives have insisted that they intend to exert no editorial control because they leave the makeup of the news feed to the algorithm. But an algorithm is not autonomous. It is written by humans and tweaked all the time.
  • That raises some journalistic questions. The news feed algorithm works, in part, by showing people more of what they have liked in the past. Some studies have suggested that means they might not see as wide a variety of news or points of view, though others, including one by Facebook researchers, have found they still do.
  • Tech companies, Facebook included, are notoriously fickle with their algorithms. Publications became so dependent on Facebook in the first place because of a change in its algorithm that sent more traffic their way. Later, another change demoted articles from sites that Facebook deemed to run click-bait headlines. Then last month, Facebook decided to prioritize some posts from friends over those from publications.
Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge from a scientific perspective of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later? Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate information, use it, and it’s partly a gateway to long-term memory and also a buffer that you use when you’re retrieving information from long-term memory and that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and the great confidence with which he did so, he was perceived as something analogous to a human tape recorder. Yet there was interesting work done by psychologist Ulric Neisser, who went back and analyzed what Dean said at the hearings against available information on the White House taping system and found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist, the meaning and the overall significance right, but the exact details in his memory were often quite different from what was actually said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that faulty eyewitness memory figures in a large majority of those cases -- one of the more recent estimates is that of the first 250 DNA exoneration cases as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant with more emotional arousal and may elicit “deeper processing,” as we call it in cognitive psychology.
Javier E

The Science of Why We Don't Believe Science | Mother Jones - 2 views

  • an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called "motivated reasoning" helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, "death panels," the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
  • The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call "affect"). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a "basic human survival skill," explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
  • reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
  • ...5 more annotations...
  • Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. "They retrieve thoughts that are consistent with their previous beliefs," says Taber, "and that will lead them to build an argument and challenge what they're hearing."
  • In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers
  • Our "reasoning" is a means to a predetermined end—winning our "case"—and is shot through with biases. They include "confirmation bias," in which we give greater heed to evidence and arguments that bolster our beliefs, and "disconfirmation bias," in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
  • That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals besides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.
grayton downing

Measuring Consciousness | The Scientist Magazine® - 0 views

  • General anesthesia has transformed surgery from a ghastly ordeal to a procedure in which the patient feels no pain.
  • “integrated-information theory,” which holds that consciousness relies on communication between different brain areas, and fades as that communication breaks down.
  • neural markers of consciousness—or more precisely, the loss of consciousness—a group led by Patrick Purdon
  • ...9 more annotations...
  • The purpose of the surgery was to remove electrodes that had previously been implanted in the patients’ brains to monitor seizures. But before they were taken out, the electrodes enabled the researchers to study the activity of individual neurons in the cortex, in addition to large-scale brain activity from EEG recordings.
  • importance of communication between discrete groups of neurons, both within the cortex and across brain regions, is analogous to a band playing music, said George Mashour, a neuroscientist and anesthesiologist at the University of Michigan, Ann Arbor. “You need musical information to come together either in time or space to really make sense,”
  • “Consciousness and cognitive activity may be similar. If different areas of the brain aren’t in synch or if a critical area that normally integrates cognitive activity isn’t functioning, you could be rendered unconscious.”
  • Purdon and colleagues were able to discern a more detailed neural signature of loss of consciousness, this time by using EEG alone. Monitoring brain activity in healthy patients for 2 hours as they underwent propofol-induced anesthesia, they observed that as responsiveness fades, high-frequency brain waves (12–35 hertz) rippling across the cortex and the thalamus were replaced by two different brain waves superimposed on one another: a low-frequency (<1 hertz) wave and an alpha-frequency (8–12 hertz) wave. “These two waves pretty much come at loss of consciousness,”
  • “We’ve started to teach our anesthesiologists how to read this signature on the EEG”
  • Anesthesia is not the only state in which consciousness is lost, of course
  • To measure the gradual breakdown of connectivity between neural networks during natural REM sleep and anesthesia, as well as in brain-injured, unresponsive patients. Using an electromagnetic coil to activate neurons in a small patch of the human cortex, then recording EEG output to track the propagation of those signals to other neuronal groups, the researchers can measure the connectivity between collections of neurons in the cortex and other brain regions.
  • minimally conscious patients, the magnetically stimulated signals propagated fairly far and wide, occasionally reaching distant cortical areas, much like activations seen in locked-in but conscious patients. In patients in a persistent vegetative state, on the other hand, propagation was severely limited—a breakdown of connectivity similar to that observed in previous tests of anesthetized patients. What’s more, in three vegetative patients that later recovered consciousness, the test picked up signs of increased connectivity before clinical signs of improvement became evident.
  • “I think understanding consciousness itself is going to help us find successful [measurement] approaches that are universally applicable,” said Pearce.
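The propofol signature described above (high-frequency beta activity giving way to superimposed slow and alpha waves) can be illustrated with a toy band-power check. This is a hedged sketch, not a clinical detector: the band edges come from the article, but the function names, thresholds, and simple periodogram method are our own assumptions.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average power of a 1-D EEG trace (sampled at fs Hz)
    within the [low, high] Hz band, via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def propofol_signature(signal, fs):
    """Heuristic check for the reported loss-of-consciousness pattern:
    slow (<1 Hz) and alpha (8-12 Hz) power dominating beta (12-35 Hz).
    Band edges follow the article; the comparison rule is illustrative."""
    slow = band_power(signal, fs, 0.1, 1.0)
    alpha = band_power(signal, fs, 8.0, 12.0)
    beta = band_power(signal, fs, 12.0, 35.0)
    return slow > beta and alpha > beta
```

On synthetic data, a trace dominated by 0.5 Hz and 10 Hz components flags the signature, while a beta-dominated trace does not; real EEG analysis would use windowed PSD estimates (e.g. Welch's method) rather than a raw periodogram.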
Javier E

A Cocktail Party With Readers - NYTimes.com - 0 views

  • Is this a good thing, I wondered, or an epic waste of time?
  • Twitter devotees at The Times tell me the benefits are real. Twitter enables them to be better reporters, for one thing. By selecting a universe of tweeters to follow, they can track news sources of all kinds, including rival journalists. They can create listening posts across every topic they need to monitor.
  • “I don’t read everything. I dip into Twitter when I have time. The analogy is a cocktail party. You can’t join every conversation, but you drift through the crowd and stop now and then.”
  • ...7 more annotations...
  • he is planning a possible trip to Mauritania and has used Twitter to query his million-man follower group in search of expertise on the country — with good results.
  • He has used it also for something that blogs and columns just aren’t appropriate for, he said: publishing a hunch.
  • Twitter also enables writers to super-publish their work. A piece may be destined for NYTimes.com or the newspaper, but that doesn’t stand in the way of tweeting out a link to it.
  • If the item is really popular, Twitter is effectively pushing a cloud of links far beyond the reach of The Times’s Web site and print edition.
  • But there is more to it than gathering, understanding and publishing. Twitter, to hear Times staffers talk about it, is an environment. Inside, Mr. Carr said, “you can see what is getting heat and what is not.”
  • “I have made an effort to create a persona out of what I report,” he said. “I am always looking for articles, interesting quotes and things that center around my domain. I think people expect that of me, and that is why they follow me on Twitter.”
  • Twitter, it seems apparent, enables journalists to report and publish actively in digital space. It allows them to create their own community. But this can become a self-limiting hive or, as Brian Stelter noted, “an echo chamber.”
Javier E

Does Google Make Us Stupid? - Pew Research Center - 0 views

  • Carr argued that the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate. "I'm not thinking the way I used to," he wrote, in part because he is becoming a skimming, browsing reader, rather than a deep and engaged reader. "The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.... If we lose those quiet spaces, or fill them up with ‘content,' we will sacrifice something important not only in our selves but in our culture."
  • force us to get smarter if we are to survive. "Most people don't realize that this process is already under way," he wrote. "In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity." He argued that while the proliferation of technology and media can challenge humans' capacity to concentrate there were signs that we are developing "fluid intelligence—the ability to find meaning in confusion and solve new problems, independent of acquired knowledge." He also expressed hope that techies will develop tools to help people find and assess information smartly.
  • 76% of the experts agreed with the statement, "By 2020, people's use of the internet has enhanced human intelligence; as people are allowed unprecedented access to more information they become smarter and make better choices. Nicholas Carr was wrong: Google does not make us stupid."
Javier E

Raymond Tallis Takes Out the 'Neurotrash' - The Chronicle Review - The Chronicle of Hig... - 0 views

  • Tallis informs 60 people gathered in a Kent lecture hall that his talk will demolish two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions.
  • Aping Mankind argues that neuroscientific approaches to things like love, wisdom, and beauty are flawed because you can't reduce the mind to brain activity alone.
  • Stephen Cave, a Berlin-based philosopher and writer who has called Aping Mankind "an important work," points out that most philosophers and scientists do in fact believe "that mind is just the product of certain brain activity, even if we do not currently know quite how." Tallis "does both the reader and these thinkers an injustice" by declaring that view "obviously" wrong,
  • ...5 more annotations...
  • Geraint Rees, director of University College London's Institute of Cognitive Neuroscience, complains that reading Tallis is "a bit like trying to nail jelly to the wall." He "rubbishes every current theory of the relationship between mind and brain, whether philosophical or neuroscientific," while offering "little or no alternative,"
  • cultural memes. The Darwinesque concept originates in Dawkins's 1976 book, The Selfish Gene. Memes are analogous to genes, Dennett has said, "replicating units of culture" that spread from mind to mind like a virus. Religion, chess, songs, clothing, tolerance for free speech—all have been described as memes. Tallis considers it absurd to talk of a noun-phrase like "tolerance for free speech" as a discrete entity. But Dennett argues that Tallis's objections are based on "a simplistic idea of what one might mean by a unit." Memes aren't units? Well, in that spirit, says Dennett, organisms aren't units of biology, nor are species—they're too complex, with too much variation. "He's got to allow theory to talk about entities which are not simple building blocks," Dennett says.
  • How is it that he perceives the glass of water on the table? How is it that he feels a sense of self over time? How is it that he can remember a patient he saw in 1973, and then cast his mind forward to his impending visit to the zoo? There are serious problems with trying to reduce such things to impulses in the brain, he argues. We can explain "how the light gets in," he says, but not "how the gaze looks out." And isn't it astonishing, he adds, that much neural activity seems to have no link to consciousness? Instead, it's associated with things like controlling automatic movements and regulating blood pressure. Sure, we need the brain for consciousness: "Chop my head off, and my IQ descends." But it's not the whole story. There is more to perceptions, memories, and beliefs than neural impulses can explain. The human sphere encompasses a "community of minds," Tallis has written, "woven out of a trillion cognitive handshakes of shared attention, within which our freedom operates and our narrated lives are led." Those views on perception and memory anchor his attack on "neurobollocks." Because if you can't get the basics right, he says, then it's premature to look to neuroscience for clues to complex things like love.
  • Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not."
  • Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."
Javier E

The Central Question: Is It 1938? - The Atlantic - 0 views

  • differences on Iran policy correspond to answers to this one question: Whether the world of 2015 is fundamentally similar to, or different from, the world of 1938.
  • the idea of recurring historic episodes has a powerful effect on decision-making in the here and now. Disagreements over policy often come down to the search for the right historic pattern to apply.
  • the idea that Europe on the eve of the Holocaust is the most useful guide to the world in 2015 runs through arguments about Iran policy. And if that is the correct model to apply, the right "picture in our heads" as Walter Lippmann put it in Public Opinion, then these conclusions naturally follow:
  • • The threatening power of the time—Nazi Germany then, the Islamists' Iran now—is a force of unalloyed evil whose very existence threatens decent life everywhere.
  • • That emerging power cannot be reasoned or bargained with but must ultimately be stopped and broken
  • • "Compromisers" are in fact appeasers who are deluding themselves about these realities
  • • The appeasers' blindness endangers people all around the world but poses an especially intolerable threat to Jews
  • • As a result of all these factors, no deal with such an implacable enemy is preferable to an inevitably flawed and Munich-like false-hope deal.
  • Also, and crucially, it means that the most obvious criticism of the speech—what's Netanyahu's plan for getting Iran to agree?—is irrelevant. What was the Allies' "plan" for getting Hitler to agree? The plan was to destroy his regime.
  • If, on the other hand, you think that the contrasts with 1938 are more striking than the similarities, you see things differently. As a brief reminder of the contrasts: the Germany of 1938 was much richer and more powerful than the Iran of today. Germany was rapidly expansionist; Iran, despite its terrorist work through proxies, has not been. The Nazi leaders had engulfed the world in war less than a decade after taking power. Iran's leaders, oppressive and destructive, have not shown similar suicidal recklessness. European Jews of 1938 were stateless, unarmed, and vulnerable. Modern Israel is a powerful, nuclear-armed force. Moreover, the world after the first wartime use of nuclear weapons, of course by the United States, is different from the world before that point.
  • Here's what I understand the more clearly after these past few weeks' drama over Prime Minister Netanyahu's speech. These differences in historic model are deep and powerful, and people with one model in mind are not going to convince people with the other mental picture.
Javier E

Baseball or Soccer? - NYTimes.com - 1 views

  • Baseball is a team sport, but it is basically an accumulation of individual activities. Throwing a strike, hitting a line drive or fielding a grounder is primarily an individual achievement. The team that performs the most individual tasks well will probably win the game.
  • In soccer, almost no task, except the penalty kick and a few others, is intrinsically individual. Soccer, as Simon Critchley pointed out recently in The New York Review of Books, is a game about occupying and controlling space. If you get the ball and your teammates have run the right formations, and structured the space around you, you’ll have three or four options on where to distribute it. If the defenders have structured their formations to control the space, then you will have no options. Even the act of touching the ball is not primarily defined by the man who is touching it; it is defined by the context created by all the other players.
  • Most of us spend our days thinking we are playing baseball, but we are really playing soccer. We think we individually choose what career path to take, whom to socialize with, what views to hold. But, in fact, those decisions are shaped by the networks of people around us more than we dare recognize.
  • “Soccer is a collective game, a team game, and everyone has to play the part which has been assigned to them, which means they have to understand it spatially, positionally and intelligently and make it effective.”
  • Then there is the structure of your network. There is by now a vast body of research on how differently people behave depending on the structure of the social networks. People with vast numbers of acquaintances have more job opportunities than people with fewer but deeper friendships
  • This influence happens through at least three avenues. First there is contagion. People absorb memes, ideas and behaviors from each other the way they catch a cold.
  • soccer is like a 90-minute anxiety dream — one of those frustrating dreams when you’re trying to get somewhere but something is always in the way. This is yet another way soccer is like life.
  • Let me simplify it with a classic observation: Each close friend you have brings out a version of yourself that you could not bring out on your own. When your close friend dies, you are not only losing the friend, you are losing the version of your personality that he or she elicited.
  • Once we acknowledge that, in life, we are playing soccer, not baseball, a few things become clear. First, awareness of the landscape of reality is the highest form of wisdom. It’s not raw computational power that matters most; it’s having a sensitive attunement to the widest environment,
  • Second, predictive models will be less useful. Baseball is wonderful for sabermetricians. In each at bat there is a limited range of possible outcomes. Activities like soccer are not as easily renderable statistically, because the relevant spatial structures are harder to quantify
  • Finally, there is the power of the extended mind. There is also a developed body of research on how much our very consciousness is shaped by the people around us.
  • Is life more like baseball, or is it more like soccer?
Javier E

André Glucksmann, French Philosopher Who Renounced Marxism, Dies at 78 - The ... - 0 views

  • In 1975, in “The Cook and the Cannibal,” Mr. Glucksmann subjected Marxism to a scalding critique. Two years later, he broadened his attack in his most influential work, “The Master Thinkers,” which drew a direct line from the philosophies of Marx, Hegel, Fichte and Nietzsche to the enormities of Nazism and Soviet Communism. It was they, he wrote in his conclusion, who “erected the mental apparatus which is indispensable for launching the grand final solutions of the 20th century.”
  • An instant best seller, the book put him in the company of several like-minded former radicals, notably Bernard-Henri Lévy and Pascal Bruckner. Known as the nouveaux philosophes, a term coined by Mr. Lévy, they became some of France’s most prominent public intellectuals, somewhat analogous to the neoconservatives in the United States, but with a lingering leftist orientation.
  • Their apostasy sent shock waves through French intellectual life, and onward to Moscow, which depended on the cachet afforded by Jean-Paul Sartre and other leftist philosophers
  • “It was André Glucksmann who dealt the decisive blow to Communism in France,”
  • “In the West, he presented the anti-totalitarian case more starkly and more passionately than anyone else in modern times,
  • “He was a passionate defender of the superoppressed, whether it was the prisoners of the Gulag, the Bosnians and Kosovars, gays during the height of the AIDS crisis, the Chechens under Putin or the Iraqis under Saddam,” he said. “When he turned against Communism, it was because he realized that Communists were not on the same side.”
  • After earning the teaching degree known as an agrégation from the École Normale Supérieure de Saint-Cloud in 1961, Mr. Glucksmann enrolled in the National Center for Scientific Research to pursue a doctorate under Raymond Aron — an odd matchup because Aron was France’s leading anti-Marxist intellectual.
  • His subsequent turn away from Marxism made him a reviled figure on the left, and former comrades looked on aghast as he became one of France’s most outspoken defenders of the United States. He argued for President Ronald Reagan’s policy of nuclear deterrence toward the Soviet Union, intervention in the Balkans and both American invasions of Iraq. In 2007, he supported the candidacy of Nicolas Sarkozy for the French presidency.
  • “There is the Glucksmann who was right and the Glucksmann who could — with the same fervor, the same feeling of being in the right — be wrong,” Mr. Lévy wrote in a posthumous appreciation for Le Monde. “What set him apart from others under such circumstances is that he would admit his error, and when he came around he was fanatical about studying his mistake, mulling it over, understanding it.”
  • In his most recent book, “Voltaire Counterattacks,” published this year, he positioned France’s greatest philosopher, long out of favor, as a penetrating voice perfectly suited to the present moment.
  • “I think thought is an individual action, not one of a party,” Mr. Glucksmann told The Chicago Tribune in 1991. “First you think. And if that corresponds with the Left, then you are of the Left; if Right, then you are of the Right. But this idea of thinking Left or Right is a sin against the spirit and an illusion.”
Javier E

It's Time for a Real Code of Ethics in Teaching - Noah Berlatsky - The Atlantic - 3 views

  • A defendant in the Atlanta Public Schools case turns herself in at the Fulton County Jail on April 2. (David Goldman/AP) Earlier this week at The Atlantic, Emily Richmond asked whether high-stakes testing caused the Atlanta schools cheating scandal. The answer, I would argue, is yes... just not in the way you might think. Tests don't cause unethical behavior. But they did cause the Atlanta cheating scandal, and they are doing damage to the teaching profession. The argument that tests do not cause unethical behavior is fairly straightforward, and has been articulated by a number of writers. Jonathan Chait quite correctly points out that unethical behavior occurs in virtually all professions -- and that it occurs particularly when there are clear incentives to succeed. Incentivizing any field increases the impetus to cheat. Suppose journalism worked the way teaching traditionally had. You get hired at a newspaper, and your advancement and pay are dictated almost entirely by your years on the job, with almost no chance of either becoming a star or of getting fired for incompetence. Then imagine journalists changed that and instituted the current system, where you can get really successful if your bosses like you or be fired if they don't. You could look around and see scandal after scandal -- phone hacking! Jayson Blair! NBC's exploding truck! Janet Cooke! Stephen Glass! -- that could plausibly be attributed to this frightening new world in which journalists had an incentive to cheat in order to get ahead. It holds true of any field. If Major League Baseball instituted tenure, and maybe used tee-ball rules where you can't keep score and everybody gets a chance to hit, it could stamp out steroid use. Students have been cheating on tests forever -- massive, systematic cheating, you could say. Why? Because they have an incentive to do well. Give teachers and administrators an incentive for their students to do well, and more of them will cheat. For Chait, then, teaching has just been made more like journalism or baseball; it has gone from an incentiveless occupation to one with incentives.
  • Chait refers to violations of journalistic ethics -- like the phone-hacking scandal -- and suggests they are analogous to Major-League steroid use, and that both are similar to teachers (or students) cheating on tests. But is phone hacking "cheating"
  • Phone hacking was, then, not an example of cheating. It was a violation of professional ethics. And those ethics are not arbitrarily imposed, but are intrinsic to the practice of journalism as a profession committed to public service and to truth.
  • Behaving ethically matters, but how it matters, and what it means, depends strongly on the context in which it occurs.
  • Ethics for teachers is not, apparently, first and foremost about educating their students, or broadening their minds. Rather, ethics for teachers in our current system consists in following the rules. The implicit, linguistic signal being given is that teachers are not like journalists or doctors, committed to a profession and to the moral code needed to achieve their professional goals. Instead, they are like athletes playing games, or (as Chait says) like children taking tests.
  • Using "cheating" as an ethical lens tends to both trivialize and infantilize teachers' work
  • Professions with social respect and social capital, like doctors and lawyers, collaborate in the creation of their own standards. The assumption is that those standards are intrinsic to the profession's goals, and that, therefore, professionals themselves are best equipped to establish and monitor them. Teachers' standards, though, are imposed from outside -- as if teachers are children, or as if teaching is a game.
  • High-stakes testing, then, does lead to cheating. It does not create unethical behavior -- but it does create the particular unethical behavior of "cheating."
  • We have reached a point where we can only talk about the ethics of the profession in terms of cheating or not cheating, as if teachers' main ethical duty is to make sure that scantron bubbles get filled in correctly. Teachers, like journalists, should have a commitment to truth; like doctors, they have a duty of care. Translating those commitments and duties into a bureaucratized measure of cheating-or-not-cheating diminishes ethics
  • For teachers it is, literally, demoralizing. It severs the moral experience of teaching from the moral evaluation of teaching, which makes it almost impossible for good teachers (in all the senses of "good") to stay in the system.
  • We need better ethics for teachers -- ethics that treat them as adults and professionals, not like children playing games.
Javier E

Yes, we should be outraged about Facebook - The Washington Post - 0 views

  • Data mining, as Burdick’s book shows, is not new. But today’s social media companies do it more extensively and more efficiently.
  • Consider an imperfect but instructive analogy. Any campaign can acquire your listed landline number. But no campaign is permitted access to your hopes, fears, worries, passions or day-to-day business by way of a phone tap. Facebook’s accumulated information may not be quite like a tap. But the company sure knows a whole lot about you.
  • We must decide when Facebook and comparable companies should be held accountable as public utilities. And when do they look more like publishers who bear responsibility for the veracity of the “information” they spread around?
  • We also need to confront conflicts between the public interest and the ways that social media companies make their profits. Where do privacy rights come in? Are they unduly blocking transparency about how political campaigns are conducted and who is financing them?
sissij

I know they've seen my message - so why haven't they replied? | Culture | The Guardian - 0 views

  • Ah, the tyranny of read receipts – enough to put you off digital communication for good.
  • It sounds straightforward enough, even perfunctory, and indeed it is if it’s only a blip in the back-and-forth. But when a message lingers on “seen” without explanation for anything beyond a few minutes, you’ve been “left on read”. It’s enough to make even the most self-assured individuals question their worth.
  • It works both ways, too: if you’ve read a message that you’re either unable or unwilling to respond to immediately, the countdown has already started.
  • You never picture them driving, or in the bath, or with relatives who don’t believe in phones at the table. In the moment, the likelihood of their secretly resenting you, or agonising over a reply that is certain to disappoint, seems far greater than it actually is.
  • The anxiety of being left on read is silly but it is real, and unique to this time. There is no analog equivalent.
  • but in that case I’d like to think you’d give them the benefit of the doubt, and assume they’d fallen over in the shower or something.
  • There’s no such goodwill in web 2.0, when everyone is assumed to be available at all times. And if not – it’s personal.
  • well, is it any wonder anxiety is so rife among Generation Y?
  • People will go to some lengths to avoid being seen to have “seen” a message – on Snapchat and Facebook, downloading the message then turning on flight mode and opening it can stop it from registering as opened.
  • Turning on “previews” that display on the lock screen will, in many cases, show enough to get the gist of a message (“I think we should break ... ”) without opening it.
  • But while some people contort themselves to avoid being seen to have “seen”, others manipulate that anxiety to their own ends.
  • But maybe read receipts and the games people play with them have just ruined my ability to trust.
  • When we’re used to good things happening instantly, time taken to craft a thoughtful reply is considered a bad thing.
  • I totally agree with the author that read receipts should be optional. I personally have some issues with them because I don't like to reply instantly unless it is an urgent message. I like to take some time to think about what I want to comment on or write back. Although society now prizes the fast and the instant, I am still a slow person. I feel that read receipts pressure me to be fast and instant.
Javier E

Video Games Aren't Addictive - The New York Times - 0 views

  • the neuroscientific analogy: that the areas in the brain associated with the pleasures of drug use are the same as those associated with the pleasures of playing video games. This is true but not illuminating. These areas of the brain — those that produce and respond to the neurotransmitter dopamine — are involved in just about any pleasurable activity: having sex, enjoying a nice conversation, eating good food, reading a book, using methamphetamines.
  • A large-scale study of internet-based games recently published in the American Journal of Psychiatry bears out our skepticism about this “addiction.” Using the American Psychiatric Association’s own metrics for ascertaining psychiatric disorder, the study’s researchers found that at most 1 percent of video game players might exhibit characteristics of an addiction and that the games were significantly less addictive than, say, gambling.
  • More damning, the study found that almost none of those classified as being possibly addicted to video games experienced negative outcomes from this addiction. That is, the mental, physical and social health of these potential “addicts” was not different from that of individuals who were not addicted to video games. This suggests that the diagnosis of addiction doesn’t make much sense to begin with
  • Consider a common diagnostic question used to help identify addiction, such as “I always use X to relax after a bad day.” Well, if X is methamphetamine, that’s a worrisome choice, one that presumably indicates addiction. But if X is playing video games, how is that different from unwinding after work by knitting, watching sports or playing bridge?
Javier E

When scientists saw the mouse heads glowing, they knew the discovery was big - The Wash... - 0 views

  • have found evidence linking problems in the lymphatic and glymphatic systems to Alzheimer’s. In a study on mice, they showed that glymphatic dysfunction contributes to the buildup in the brain of amyloid beta, a protein that plays a key role in the disease.
  • several colleagues examined postmortem tissue from 79 human brains. They focused on aquaporin-4, a key protein in glymphatic vessels. In the brains of people with Alzheimer’s, this protein was jumbled; in those without the disease, the protein was well organized. This suggests that glymphatic breakdowns may play a role in the disease
  • The vessels have also been implicated in autoimmune disease. Researchers knew that the immune system has limited access to the brain. But at the same time, the immune system kept tabs on the brain’s status; no one knew exactly how. Some researchers theorize that the glymphatic system could be the conduit and that in diseases such as multiple sclerosis — where the body’s immune system attacks certain brain cells — the communication may go awry.
  • The system may also play a role in symptoms of traumatic brain injur
  • Mice are a good model, she says, because their glymphatic systems are very similar to humans’. She and Iliff found that even months after being injured, the animals’ brains were still not clearing waste efficiently, leading to a buildup of toxic compounds, including amyloid beta. Nedergaard returns to the dishwasher analogy. “It’s like if you only use a third of the water when you turn on the machine,” she says. “You won’t get clean dishes.”
  • in mice, omega-3 fatty acids improved glymphatic functioning.
  • Nedergaard has shown that at least in mice, the system processes twice as much fluid during sleep as it does during wakefulness. She and her colleagues focused on amyloid beta; they found that the lymphatic system removed much more of the protein when the animals were asleep than when they were awake. She suggests that over time, sleep dysfunction may contribute to Alzheimer’s and perhaps other brain illnesses. “You only clean your brain when you’re sleeping,” she says. “This is probably an important reason that we sleep. You need time off from consciousness to do the housekeeping.”
  • Sleeping on your stomach is also not very effective; sleeping on your back is somewhat better, while lying on your side appears to produce the best results.
  • glymphatic flow is significantly decreased in the period just before a migraine. The intense pain in these headaches is caused largely by inflamed nerves in the tissue that surrounds the brain. Neuroscientists Rami Burstein and Aaron Schain, the lead authors, theorize that faulty clearance of molecular waste from the brain could trigger inflammation in these pain fibers.
  • other scientists have found that deep breathing significantly increases the glymphatic transport of cerebrospinal fluid into the brain.
anonymous

VHS Tapes Are Worth Money - The New York Times - 0 views

  • Who Is Still Buying VHS Tapes?
  • Despite the rise of streaming, there is still a vast library of moving images that are categorically unavailable anywhere else. Also a big nostalgia factor.
  • The last VCR, according to Dave Rodriguez, 33, a digital repository librarian at Florida State University in Tallahassee, Fla., was produced in 2016
  • But the VHS tape itself may be immortal.
  • Today, a robust marketplace exists, both virtually and in real life, for this ephemera.
  • “Hold steady. Price seems fair. It is a Classic.”
  • Driving the passionate collection of this form of media is the belief that VHS offers something that other types of media cannot.
  • “The general perception that people can essentially order whatever movie they want from home is flat-out wrong,”
  • “promised as a giant video store on the internet, where a customer was only one click away from the exact film they were looking for.”
  • “Anything that you can think of is on VHS tape, because, you’ve got to think, it was a revolutionary piece of the media,”
  • “It was a way for everyone to capture something and then put it out there.”
  • preservation
  • “just so much culture packed into VHS,”
  • a movie studio, an independent filmmaker, a parent shooting their kid’s first steps, etc.
  • finds the medium inspirational
  • “some weird, obscure movie on VHS I would have seen at my friend’s house, late at night, after his parents were asleep.
  • “The quality feels raw but warm and full of flavor,” he said of VHS.
  • views them as a byway connecting her with the past
  • from reels depicting family gatherings to movies that just never made the jump to DVD
  • “I think we were the last to grow up without the internet, cellphones or social media,” and clinging to the “old analog ways,” she said, feels “very natural.”
  • “I think that people are nostalgic for the aura of the VHS era,”
  • “So many cultural touch points are rooted there,” Mr. Harris said of the 1980s.
  • It was, he believes, “a time when, in some ways, Americans knew who we were.”
  • Not only could film connoisseurs peruse the aisles of video stores on Friday nights, but they could also compose home movies, from the artful to the inane
  • “In its heyday, it was mass-produced and widely adopted,”
  • She inherited some of them from her grandmother, a children’s librarian with a vast collection.
  • Historical Journal of Film, Radio and Television
  • the first technology that allowed mass, large-scale home media access to films.”
  • Mr. Arrow said that home videos captured on VHS, or taped television programs that contain old commercials and snippets from the news, are particularly insightful in diving into cultural history.
  • “There’ll be a news break, and you’ll see, like: Oh my god, O.J.’s still in the Bronco, and it’s on the news, and then it’ll cut back to ‘Mission Impossible’ or something.”
  • Marginalized communities, Mr. Harris said, who were not well represented in media in the 1980s, benefited from VHS technology, which allowed them to create an archival system that now brings to life people and communities that were otherwise absent from the screen.
  • The nature of VHS, Mr. Harris said, made self-documentation “readily available,
  • people who lacked representation could “begin to build a library, an archive, to affirm their existence and that of their community.”
  • VHS enthusiasts agree that these tapes occupy an irreplaceable place in culture.
  • “It’s like a time capsule,”
  • “The medium is like no other.”
anonymous

Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself? - The Ne... - 0 views

  • Why Did the Dean of the Most Diverse Law School in the Country Cancel Herself?
  • Was it the unfortunate use of a single word? Or something far more complicated?
  • Mary Lu Bilek, who has spent 32 years at the law school at the City University of New York, the past five of them as dean, sent an email to students and faculty with the subject line: “Apology.”
  • Discussing a contentious issue of race and tenure in a committee meeting last fall, she had likened herself to a “slaveholder.”
  • It was a strange, deeply jarring thing to say, but she had been trying to make the point that her position left her responsible for whatever racial inequities might exist institutionally
  • What the dean might have regarded as an admission of culpability, some of her colleagues viewed as an expression of the buried prejudices well-intentioned liberals never think they have.
  • Ms. Bilek quickly realized that she had drawn a terrible — perhaps unforgivable — analogy
  • “begun education and counseling to uncover and overcome my biases.”
  • To colleagues in the field, the circumstances of Ms. Bilek’s departure struck a note that was both ironic and painful.
  • Decades ago, long before it became commonplace, Ms. Bilek railed against the bar exam and other standardized tests for their disparate impact on low-income students
  • She had presented herself and the institution as “anti-racist,” they wrote, while ignoring how her own decisions perpetuated “institutional racism.”
  • On the face of things, it seemed as though Ms. Bilek had been lost to the maw of cancel culture and its relentless appetite for hapless boomer prey.
  • “I regret that my mistake means that I will not be doing that work” — the work of fighting racism — “with my CUNY colleagues,”
  • “Her reputation in the world of deans is that of someone who cares deeply about racial justice,”
  • Prestige in academia begins, of course, with tenure. Ms. Bilek’s troubles started last spring when she argued for granting early tenure, an extremely precious commodity, to someone about to become an administrator — a young white woman named Allie Robbins
  • Without tenure, administrative work in a university is an especially oppressive time suck, robbing an academic of the hours that could be spent on research and writing and conference-going — essentially, what is required for tenure.
  • Beyond that, the risk of alienating people who someday might weigh in on your own tenure case remained high.
  • As the fall progressed, anger continued to foment around Ms. Bilek.
  • The day after Christmas, 22 faculty members wrote a letter denouncing her wish to leapfrog a white junior academic in the promotion process, her “slaveholder” reference, and what they viewed as her resistance to listen to faculty members of color on the personnel committee “as they pointed out the disparate racial impacts” of her conduct.
  • “But I am certain that the work they do within the Law School and in the world will bring us to a more equal, anti-racist society.”
  • Next came a list of demands that included a public apology for her misdeeds, changes to practices in governance and a retreat from any outside roles furthering the perception that she was “an anti-racist dean.”
  • “We intentionally chose not to ask her to step down but to demand instead that she commit to the systemic work that her stated anti-racist principles required,”
  • “Dean Bilek chose to ignore that outstretched hand.”
  • “We said, ‘We don’t want to make a scene — no single action should define any of us. We don’t want to take away from all the work you’ve done at the law school, but we want the accountability,’”
  • “I thought there was a chance for redemption — we do not want to cancel folks; we are not people who think in carceral ways.”
  • Kept under wraps, news of all this turmoil reached the student body only last week, and when they discovered what Ms. Bilek had said and done and how long they had been left oblivious, a large and vocal faction did not feel as generously
Javier E

Opinion | Humans Are Animals. Let's Get Over It. - The New York Times - 0 views

  • The separation of people from, and the superiority of people to, members of other species is a good candidate for the originating idea of Western thought. And a good candidate for the worst.
  • Like Plato, Hobbes associates anarchy with animality and civilization with the state, which gives to our merely animal motion moral content for the first time and orders us into a definite hierarchy.
  • It is rationality that gives us dignity, that makes a claim to moral respect that no mere animal can deserve. “The moral law reveals to me a life independent of animality,” writes Immanuel Kant in “Critique of Practical Reason.” In this assertion, at least, the Western intellectual tradition has been remarkably consistent.
  • the devaluation of animals and disconnection of us from them reflect a deeper devaluation of the material universe in general
  • In this scheme of things, we owe nature nothing; it is to yield us everything. This is the ideology of species annihilation and environmental destruction, and also of technological development.
  • Further trouble is caused when the distinctions between humans and animals are then used to draw distinctions among human beings
  • Some of us, in short, are animals — and some of us are better than that. This, it turns out, is a useful justification for colonialism, slavery and racism.
  • The classical source for this distinction is certainly Aristotle. In the “Politics,” he writes, “Where then there is such a difference as that between soul and body, or between men and animals (as in the case of those whose business is to use their body, and who can do nothing better), the lower sort are by nature slaves.”
  • Every human hierarchy, insofar as it can be justified philosophically, is treated by Aristotle by analogy to the relation of people to animals.
  • One difficult thing to face about our animality is that it entails our deaths; being an animal is associated throughout philosophy with dying purposelessly, and so with living meaninglessly.
  • this line of thought also happens to justify colonizing or even extirpating the “savage,” the beast in human form.
  • Our supposed fundamental distinction from “beasts,” “brutes” and “savages” is used to divide us from nature, from one another and, finally, from ourselves
  • In Plato’s “Republic,” Socrates divides the human soul into two parts. The soul of the thirsty person, he says, “wishes for nothing else than to drink.” But we can restrain ourselves. “That which inhibits such actions,” he concludes, “arises from the calculations of reason.” When we restrain or control ourselves, Plato argues, a rational being restrains an animal.
  • In this view, each of us is both a beast and a person — and the point of human life is to constrain our desires with rationality and purify ourselves of animality
  • These sorts of systematic self-divisions come to be refigured in Cartesian dualism, which separates the mind from the body, or in Sigmund Freud’s distinction between id and ego, or in the neurological contrast between the functions of the amygdala and the prefrontal cortex.
  • I don’t know how to refute it, exactly, except to say that I don’t feel myself to be a logic program running on an animal body; I’d like to consider myself a lot more integrated than that.
  • And I’d like to repudiate every political and environmental conclusion ever drawn by our supposed transcendence of the order of nature
  • There is no doubt that human beings are distinct from other animals, though not necessarily more distinct than other animals are from one another. But maybe we’ve been too focused on the differences for too long. Maybe we should emphasize what all us animals have in common.