TOK Friends — group items matching “into” in title, tags, annotations or url

Korean philosophy is built upon daily practice of good habits | Aeon Essays

  • ‘We are unknown, we knowers, ourselves to ourselves,’ wrote Friedrich Nietzsche at the beginning of On the Genealogy of Morals (1887).
  • This seeking after ourselves, however, is not something that is lacking in Buddhist and Confucian traditions – especially not in the case of Korean philosophy. Self-cultivation, central to the tradition, underscores that the onus is on the individual to develop oneself, without recourse to the divine or the supernatural
  • Korean philosophy is practical, while remaining agnostic to a large degree: recognising the spirit realm but highlighting that we ourselves take charge of our lives by taking charge of our minds
  • The word for ‘philosophy’ in Korean is 철학, pronounced ch’ŏrhak. It literally means the ‘study of wisdom’ or, perhaps better, ‘how to become wise’, which reflects its more dynamic and proactive implications
  • At night, in the darkness of the cave, he drank water from a perfectly useful ‘bowl’. But when he could see properly, he found that there was no ‘bowl’ at all, only a disgusting human skull.
  • Our lives and minds are affected by others (and their actions), as others (and their minds) are affected by our actions. This is particularly true in the Korean application of Confucian and Buddhist ideas.
  • Wŏnhyo understood that how we think about things shapes their very existence – and in turn our own existence, which is constructed according to our thoughts.
  • In the Korean tradition of philosophy, human beings are social beings, therefore knowing how to interact with others is an essential part of living a good life – indeed, living well with others is our real contribution to human life
  • he realised that there isn’t a difference between the ‘bowl’ and the skull: the only difference lies with us and our perceptions. We interpret our lives through a continual stream of thoughts, and so we become what we think, or rather how we think
  • As our daily lives are shaped by our thoughts, so our experience of this reality is good or bad – depending on our thoughts – which make things ‘appear’ good or bad because, in ‘reality’, things in and of themselves are devoid of their own independent nature
  • We can take from Wŏnhyo the idea that, if you change the patterns that have become engrained in how you think, you will begin to live differently. To do this, you need to change your mental habits, which is why meditation and mindful awareness can help. And this needs to be practised every day
  • Wŏnhyo’s most important work is titled Awaken your Mind and Practice (in Korean, Palsim suhaeng-jang). It is an explicit call to younger adherents to put Buddhist ideas into practice, and an indirect warning not to get lost in contemplation or in the study of texts.
  • While Wŏnhyo had emphasised the mind and the need to ‘practise’ Buddhism, a later Korean monk, Chinul (1158-1210), spearheaded Sŏn, the meditational tradition in Korea that espoused the idea of ‘sudden enlightenment’ that alerts the mind, accompanied by ‘gradual cultivation’
  • we still need to practise meditation, for if not we can easily fall into our old ways even if our minds have been awakened
  • his greatest contribution to Sŏn is Secrets on Cultivating the Mind (Susim kyŏl). This text outlines in detail his teachings on sudden awakening followed by the need for gradual cultivation
  • Chinul’s approach recognises the mind as the ‘essence’ of one’s Buddha nature (contained in the mind, which is inherently good), while continual practice and cultivation aids in refining its ‘function’ – this is the origin of the ‘essence-function’ concept that has since become central to Korean philosophy.
  • These ideas also influenced the reformed view of Confucianism that became linked with the mind and other metaphysical ideas, finally becoming known as Neo-Confucianism.
  • During the Chosŏn dynasty (1392-1910), the longest lasting in East Asian history, Neo-Confucianism became integrated into society at all levels through rituals for marriage, funerals and ancestors
  • Neo-Confucianism recognises that we as individuals exist through plural relationships with responsibilities to others (as a child, brother/sister, lover, husband/wife, parent, teacher/student and so on), an idea nicely captured in 2000 by the French philosopher Jean-Luc Nancy when he described our ‘being’ as ‘singular plural’
  • Corrupt interpretations of Confucianism by heteronormative men have historically championed these ideas in terms of vertical relationships rather than as a reciprocal set of benevolent social interactions, meaning that women have suffered greatly as a result.
  • Setting aside these sexist and self-serving interpretations, Confucianism emphasises that society works as an interconnected set of complementary reciprocal relationships that should be beneficial to all parties within a social system
  • Confucian relationships have the potential to offer us an example of effective citizenship, similar to that outlined by Cicero, where the good of the republic or state is at the centre of being a good citizen
  • There is a general consensus in Korean philosophy that we have an innate sociability and therefore should have a sense of duty to each other and to practise virtue.
  • The main virtue of Confucianism is the idea of ‘humanity’, coming from the Chinese character 仁, often left untranslated and written as ren and pronounced in Korean as in.
  • It is a combination of the character for a human being and the number two. In other words, it signifies what (inter)connects two people, or rather how they should interact in a humane or benevolent manner to each other. This character therefore highlights the link between people while emphasising that the most basic thing that makes us ‘human’ is our interaction with others.
  • Neo-Confucianism adopted a turn towards a more mind-centred view in the writings of the Korean scholar Yi Hwang, known by his pen name T’oegye (1501-70), who appears on the 1,000-won note. He greatly influenced Neo-Confucianism in Japan through his formidable text, Ten Diagrams on Sage Learning (Sŏnghak sipto), composed in 1568, which was one of the most-reproduced texts of the entire Chosŏn dynasty and represents the synthesis of Neo-Confucian thought in Korea
  • with commentaries that elucidate the moral principles of Confucianism, related to the cardinal relationships and education. It also embodies T’oegye’s own development of moral psychology through his focus on the mind, and illuminates the importance of teaching and the practice of self-cultivation.
  • He writes that we ourselves can transform the unrestrained mind and its desires, and achieve sagehood, if we take the arduous, gradual path of self-cultivation centred on the mind.
  • Confucians had generally accepted the Mencian idea that human nature was embodied in the unaroused state of the mind, before it was shaped by its environment. The mind in its unaroused state was taken to be theoretically good. However, this inborn tendency for goodness is always in danger of being reduced to passivity, unless you cultivate yourself as a person of ‘humanity’ (in the Confucian sense mentioned above).
  • You should constantly try to activate your humanity to allow the unhampered operation of the original mind to manifest itself through socially responsible and moral character in action
  • Humanity is the realisation of what I describe as our ‘optimum level of perfection’ that exists in an inherent stage of potentiality due to our innate good nature
  • This, in a sense, is like the Buddha nature of the Buddhists, which suggests we are already enlightened and just need to recover our innate mental state. Both philosophies are hopeful: humans are born good with the potential to correct their own flaws and failures
  • this could hardly contrast any more greatly with the Christian doctrine of original sin
  • The seventh diagram in T’oegye’s text is entitled ‘The Diagram of the Explanation of Humanity’ (Insŏl-to). Here he warns how one’s good inborn nature may become impaired, hampering the operation of the original mind and negatively impacting our character in action. Humanity embodies the gradual realisation of our optimum level of perfection that already exists in our mind but that depends on how we think about things and how we relate that to others in a social context
  • For T’oegye, the key to maintaining our capacity to remain level-headed, and to control our impulses and emotions, was kyŏng. This term is often translated as ‘seriousness’, occasionally ‘mindfulness’, and it identifies the serious need for constant effort to control one’s mind in order to go about one’s life in a healthy manner
  • For T’oegye, mindfulness is as serious as meditation is for the Buddhists. In fact, the Neo-Confucians had their own meditational practice of ‘quiet-sitting’ (chŏngjwa), which focused on recovering the calm and not agitated ‘original mind’, before putting our daily plans into action
  • These diagrams reinforce this need for a daily practice of Confucian mindfulness, because practice leads to the ‘good habit’ of creating (and maintaining) routines. There is no short-cut provided, no weekend intro to this practice: it is life-long, and that is what makes it transformative, leading us to become better versions of who we were in the beginning. This is the consolation of Korean philosophy.
  • Seeing the world as it is can steer us away from making unnecessary mistakes, while highlighting what is good and how to maintain that good while also reducing anxiety from an agitated mind and harmful desires. This is why Korean philosophy can provide us with consolation; it recognises the bad, but prioritises the good, providing several moral pathways that are referred to in the East Asian traditions (Confucianism, Buddhism and Daoism) as modes of ‘self-cultivation’
  • As social beings, we penetrate the consciousness of others, and so humans are linked externally through conduct but also internally through thought. Humanity is a unifying approach that holds the potential to solve human problems, internally and externally, as well as help people realise the perfection that is innately theirs

It's Not Just the Discord Leak. Group Chats Are the Internet's New Chaos Machine. - The...

  • Digital bulletin-board systems—proto–group chats, you could say—date back to the 1970s, and SMS-style group chats popped up in WhatsApp and iMessage in 2011.
  • As New York magazine put it in 2019, group chats became “an outright replacement for the defining mode of social organization of the past decade: the platform-centric, feed-based social network.”
  • unlike the Facebook feed or Twitter, where posts can be linked to wherever, group chats are a closed system—a safe and (ideally) private space. What happens in the group chat ought to stay there.
  • In every group chat, no matter the size, participants fall into informal roles. There is usually a leader—a person whose posting frequency drives the group or sets the agenda. Often, there are lurkers who rarely chime in
  • Larger group chats are not immune to the more toxic dynamics of social media, where competition for attention and herd behavior cause infighting, splintering, and back-channeling.
  • It’s enough to make one think, as the writer Max Read argued, that “venture-capitalist group chats are a threat to the global economy.” Now you might also say they are a threat to national security.
  • thanks to the private nature of the group chats, this information largely stayed out of the public eye. As Bloomberg reported, “By the time most people figured out that a bank run was a possibility … it was already well underway.”
  • The investor panic that led to the swift collapse of Silicon Valley Bank in March was effectively caused by runaway group-chat dynamics. “It wasn’t phone calls; it wasn’t social media,” a start-up founder told Bloomberg in March. “It was private chat rooms and message groups.”
  • Unlike traditional social media or even forums and message boards, group chats are nearly impossible to monitor.
  • as our digital social lives start to splinter off from feeds and large audiences and into siloed areas, a different kind of unpredictability and chaos awaits. Where social networks create a context collapse—a process by which information meant for one group moves into unfamiliar networks and is interpreted by outsiders—group chats seem to be context amplifiers
  • group chats provide strong relationship dynamics, and create in-jokes and lore. For decades, researchers have warned of the polarizing effects of echo chambers across social networks; group chats realize this dynamic fully.
  • Weird things happen in echo chambers. Constant reinforcement of beliefs or ideas might lead to group polarization or radicalization. It may trigger irrational herd behavior such as, say, attempting to purchase a copy of the Constitution through a decentralized autonomous organization
  • Obsession with in-group dynamics might cause people to lose touch with the reality outside the walls of a particular community; the private-seeming nature of a closed group might also lull participants into a false sense of security, as it did with Teixeira.
  • the age of the group chat appears to be at least as unpredictable, swapping a very public form of volatility for a more siloed, incalculable version

The new science of death: 'There's something happening in the brain that makes no sense...

  • Jimo Borjigin, a professor of neurology at the University of Michigan, had been troubled by the question of what happens to us when we die. She had read about the near-death experiences of certain cardiac-arrest survivors who had undergone extraordinary psychic journeys before being resuscitated. Sometimes, these people reported travelling outside of their bodies towards overwhelming sources of light where they were greeted by dead relatives. Others spoke of coming to a new understanding of their lives, or encountering beings of profound goodness
  • Borjigin didn’t believe the content of those stories was true – she didn’t think the souls of dying people actually travelled to an afterworld – but she suspected something very real was happening in those patients’ brains. In her own laboratory, she had discovered that rats undergo a dramatic storm of many neurotransmitters, including serotonin and dopamine, after their hearts stop and their brains lose oxygen. She wondered if humans’ near-death experiences might spring from a similar phenomenon, and if it was occurring even in people who couldn’t be revived
  • when she looked at the scientific literature, she found little enlightenment. “To die is such an essential part of life,” she told me recently. “But we knew almost nothing about the dying brain.” So she decided to go back and figure out what had happened inside the brains of people who died at the University of Michigan neurointensive care unit.
  • Since the 1960s, advances in resuscitation had helped to revive thousands of people who might otherwise have died. About 10% or 20% of those people brought with them stories of near-death experiences in which they felt their souls or selves departing from their bodies
  • According to several international surveys and studies, one in 10 people claims to have had a near-death experience involving cardiac arrest, or a similar experience in circumstances where they may have come close to death. That’s roughly 800 million souls worldwide who may have dipped a toe in the afterlife.
  • In the 1970s, a small network of cardiologists, psychiatrists, medical sociologists and social psychologists in North America and Europe began investigating whether near-death experiences proved that dying is not the end of being, and that consciousness can exist independently of the brain. The field of near-death studies was born.
  • in 1975, an American medical student named Raymond Moody published a book called Life After Life.
  • Meanwhile, new technologies and techniques were helping doctors revive more and more people who, in earlier periods of history, would have almost certainly been permanently deceased.
  • “We are now at the point where we have both the tools and the means to scientifically answer the age-old question: What happens when we die?” wrote Sam Parnia, an accomplished resuscitation specialist and one of the world’s leading experts on near-death experiences, in 2006. Parnia himself was devising an international study to test whether patients could have conscious awareness even after they were found clinically dead.
  • Borjigin, together with several colleagues, took the first close look at the record of electrical activity in the brain of Patient One after she was taken off life support. What they discovered – in results reported for the first time last year – was almost entirely unexpected, and has the potential to rewrite our understanding of death.
  • “I believe what we found is only the tip of a vast iceberg,” Borjigin told me. “What’s still beneath the surface is a full account of how dying actually takes place. Because there’s something happening in there, in the brain, that makes no sense.”
  • Over the next 30 years, researchers collected thousands of case reports of people who had had near-death experiences
  • Moody was their most important spokesman; he eventually claimed to have had multiple past lives and built a “psychomanteum” in rural Alabama where people could attempt to summon the spirits of the dead by gazing into a dimly lit mirror.
  • near-death studies was already splitting into several schools of belief, whose tensions continue to this day. One influential camp was made up of spiritualists, some of them evangelical Christians, who were convinced that near-death experiences were genuine sojourns in the land of the dead and divine
  • It is no longer unheard of for people to be revived even six hours after being declared clinically dead. In 2011, Japanese doctors reported the case of a young woman who was found in a forest one morning after an overdose stopped her heart the previous night; using advanced technology to circulate blood and oxygen through her body, the doctors were able to revive her more than six hours later, and she was able to walk out of the hospital after three weeks of care
  • The second, and largest, faction of near-death researchers were the parapsychologists, those interested in phenomena that seemed to undermine the scientific orthodoxy that the mind could not exist independently of the brain. These researchers, who were by and large trained scientists following well established research methods, tended to believe that near-death experiences offered evidence that consciousness could persist after the death of the individual
  • Their aim was to find ways to test their theories of consciousness empirically, and to turn near-death studies into a legitimate scientific endeavour.
  • Finally, there emerged the smallest contingent of near-death researchers, who could be labelled the physicalists. These were scientists, many of whom studied the brain, who were committed to a strictly biological account of near-death experiences. Like dreams, the physicalists argued, near-death experiences might reveal psychological truths, but they did so through hallucinatory fictions that emerged from the workings of the body and the brain.
  • Between 1975, when Moody published Life After Life, and 1984, only 17 articles in the PubMed database of scientific publications mentioned near-death experiences. In the following decade, there were 62. In the most recent 10-year span, there were 221.
  • Today, there is a widespread sense throughout the community of near-death researchers that we are on the verge of great discoveries
  • “We really are in a crucial moment where we have to disentangle consciousness from responsiveness, and maybe question every state that we consider unconscious,”
  • “I think in 50 or 100 years time we will have discovered the entity that is consciousness,” he told me. “It will be taken for granted that it wasn’t produced by the brain, and it doesn’t die when you die.”
  • it is in large part because of a revolution in our ability to resuscitate people who have suffered cardiac arrest
  • In his book, Moody distilled the reports of 150 people who had had intense, life-altering experiences in the moments surrounding a cardiac arrest. Although the reports varied, he found that they often shared one or more common features or themes. The narrative arc of the most detailed of those reports – departing the body and travelling through a long tunnel, having an out-of-body experience, encountering spirits and a being of light, one’s whole life flashing before one’s eyes, and returning to the body from some outer limit – became so canonical that the art critic Robert Hughes could refer to it years later as “the familiar kitsch of near-death experience”.
  • Loss of oxygen to the brain and other organs generally follows within seconds or minutes, although the complete cessation of activity in the heart and brain – which is often called “flatlining” or, in the case of the latter, “brain death” – may not occur for many minutes or even hours.
  • That began to change in 1960, when the combination of mouth-to-mouth ventilation, chest compressions and external defibrillation known as cardiopulmonary resuscitation, or CPR, was formalised. Shortly thereafter, a massive campaign was launched to educate clinicians and the public on CPR’s basic techniques, and soon people were being revived in previously unthinkable, if still modest, numbers.
  • scientists learned that, even in its acute final stages, death is not a point, but a process. After cardiac arrest, blood and oxygen stop circulating through the body, cells begin to break down, and normal electrical activity in the brain gets disrupted. But the organs don’t fail irreversibly right away, and the brain doesn’t necessarily cease functioning altogether. There is often still the possibility of a return to life. In some cases, cell death can be stopped or significantly slowed, the heart can be restarted, and brain function can be restored. In other words, the process of death can be reversed.
  • In a medical setting, “clinical death” is said to occur at the moment the heart stops pumping blood, and the pulse stops. This is widely known as cardiac arrest
  • In 2019, a British woman named Audrey Schoeman who was caught in a snowstorm spent six hours in cardiac arrest before doctors brought her back to life with no evident brain damage.
  • That is a key tenet of the parapsychologists’ arguments: if there is consciousness without brain activity, then consciousness must dwell somewhere beyond the brain
  • Some of the parapsychologists speculate that it is a “non-local” force that pervades the universe, like electromagnetism. This force is received by the brain, but is not generated by it, the way a television receives a broadcast.
  • In order for this argument to hold, something else has to be true: near-death experiences have to happen during death, after the brain shuts down
  • To prove this, parapsychologists point to a number of rare but astounding cases known as “veridical” near-death experiences, in which patients seem to report details from the operating room that they might have known only if they had conscious awareness during the time that they were clinically dead.
  • At the very least, Parnia and his colleagues have written, such phenomena are “inexplicable through current neuroscientific models”. Unfortunately for the parapsychologists, however, none of the reports of post-death awareness holds up to strict scientific scrutiny. “There are many claims of this kind, but in my long decades of research into out-of-body and near-death experiences I never met any convincing evidence that this is true,”
  • In other cases, there’s not enough evidence to prove that the experiences reported by cardiac arrest survivors happened when their brains were shut down, as opposed to in the period before or after they supposedly “flatlined”. “So far, there is no sufficiently rigorous, convincing empirical evidence that people can observe their surroundings during a near-death experience,”
  • The parapsychologists tend to push back by arguing that even if each of the cases of veridical near-death experiences leaves room for scientific doubt, surely the accumulation of dozens of these reports must count for something. But that argument can be turned on its head: if there are so many genuine instances of consciousness surviving death, then why should it have so far proven impossible to catch one empirically?
  • The spiritualists and parapsychologists are right to insist that something deeply weird is happening to people when they die, but they are wrong to assume it is happening in the next life rather than this one. At least, that is the implication of what Jimo Borjigin found when she investigated the case of Patient One.
  • Given the levels of activity and connectivity in particular regions of her dying brain, Borjigin believes it’s likely that Patient One had a profound near-death experience with many of its major features: out-of-body sensations, visions of light, feelings of joy or serenity, and moral re-evaluations of one’s life. Of course,
  • “As she died, Patient One’s brain was functioning in a kind of hyperdrive,” Borjigin told me. For about two minutes after her oxygen was cut off, there was an intense synchronisation of her brain waves, a state associated with many cognitive functions, including heightened attention and memory. The synchronisation dampened for about 18 seconds, then intensified again for more than four minutes. It faded for a minute, then came back for a third time.
  • In those same periods of dying, different parts of Patient One’s brain were suddenly in close communication with each other. The most intense connections started immediately after her oxygen stopped, and lasted for nearly four minutes. There was another burst of connectivity more than five minutes and 20 seconds after she was taken off life support. In particular, areas of her brain associated with processing conscious experience – areas that are active when we move through the waking world, and when we have vivid dreams – were communicating with those involved in memory formation. So were parts of the brain associated with empathy.
  • something that looked astonishingly like life was taking place over several minutes in Patient One’s brain.
  • Although a few earlier instances of brain waves had been reported in dying human brains, nothing as detailed and complex as what occurred in Patient One had ever been detected.
  • In the moments after Patient One was taken off oxygen, there was a surge of activity in her dying brain. Areas that had been nearly silent while she was on life support suddenly thrummed with high-frequency electrical signals called gamma waves. In particular, the parts of the brain that scientists consider a “hot zone” for consciousness became dramatically alive. In one section, the signals remained detectable for more than six minutes. In another, they were 11 to 12 times higher than they had been before Patient One’s ventilator was removed.
  • “The brain, contrary to everybody’s belief, is actually super active during cardiac arrest,” Borjigin said. Death may be far more alive than we ever thought possible.
  • “The brain is so resilient, the heart is so resilient, that it takes years of abuse to kill them,” she pointed out. “Why then, without oxygen, can a perfectly healthy person die within 30 minutes, irreversibly?”
  • Evidence is already emerging that even total brain death may someday be reversible. In 2019, scientists at Yale University harvested the brains of pigs that had been decapitated in a commercial slaughterhouse four hours earlier. Then they perfused the brains for six hours with a special cocktail of drugs and synthetic blood. Astoundingly, some of the cells in the brains began to show metabolic activity again, and some of the synapses even began firing.

As ARM Chief Steps Down, Successor Talks About 'Body Computing' - NYTimes.com

  • ARM was originally a project inside Acorn Computer, a personal computer maker long since broken up. From relative obscurity, ARM’s chip designs now make up nearly one-third of new chip consumption, hurting companies like Intel.
  • The big coming focus, Mr. Segars said, will be deploying chips into a sensor-rich world. “Low-cost microcontrollers with a wireless interface,” he said. “There will be billions of these.” The sensor data will be processed either locally, on millions of small computers capable of making decisions on their own, or collected and passed along to even bigger computer systems. “The systems will go through different aggregation points,” Mr. Segars said. “If an aggregator in the home can tell a fridge is using too much power, maybe it needs servicing.”
  • “The car is ripe for a revolution. It will evolve into a consumer electronics device, paying for parking as you pull up to the curb.” Eventually, said Mr. East, “it’s getting into people’s bodies. Over the next several years, semiconductors will be so small and use so little power that they’ll run inside us as systems.”

A Million First Dates - Dan Slater - The Atlantic

  • In his 2004 book, The Paradox of Choice, the psychologist Barry Schwartz indicts a society that “sanctifies freedom of choice so profoundly that the benefits of infinite options seem self-evident.” On the contrary, he argues, “a large array of options may diminish the attractiveness of what people actually choose, the reason being that thinking about the attractions of some of the unchosen options detracts from the pleasure derived from the chosen one.”
  • Psychologists who study relationships say that three ingredients generally determine the strength of commitment: overall satisfaction with the relationship; the investment one has put into it (time and effort, shared experiences and emotions, etc.); and the quality of perceived alternatives. Two of the three—satisfaction and quality of alternatives—could be directly affected by the larger mating pool that the Internet offers.
  • as the range of options grows larger, mate-seekers are liable to become “cognitively overwhelmed,” and deal with the overload by adopting lazy comparison strategies and examining fewer cues. As a result, they are more likely to make careless decisions than they would be if they had fewer options.
  • research elsewhere has found that people are less satisfied when choosing from a larger group: in one study, for example, subjects who selected a chocolate from an array of six options believed it tasted better than those who selected the same chocolate from an array of 30.
  • evidence shows that the perception that one has appealing alternatives to a current romantic partner is a strong predictor of low commitment to that partner.
  • But the pace of technology is upending these rules and assumptions. Relationships that begin online, Jacob finds, move quickly. He chalks this up to a few things. First, familiarity is established during the messaging process, which also often involves a phone call. By the time two people meet face-to-face, they already have a level of intimacy. Second, if the woman is on a dating site, there’s a good chance she’s eager to connect. But for Jacob, the most crucial difference between online dating and meeting people in the “real” world is the sense of urgency. Occasionally, he has an acquaintance in common with a woman he meets online, but by and large she comes from a different social pool. “It’s not like we’re just going to run into each other again,” he says. “So you can’t afford to be too casual. It’s either ‘Let’s explore this’ or ‘See you later.’ ”
  • The phenomenon extends beyond dating sites to the Internet more generally. “I’ve seen a dramatic increase in cases where something on the computer triggered the breakup,” he says. “People are more likely to leave relationships, because they’re emboldened by the knowledge that it’s no longer as hard as it was to meet new people. But whether it’s dating sites, social media, e‑mail—it’s all related to the fact that the Internet has made it possible for people to communicate and connect, anywhere in the world, in ways that have never before been seen.”
  • People seeking commitment—particularly women—have developed strategies to detect deception and guard against it. A woman might withhold sex so she can assess a man’s intentions. Theoretically, her withholding sends a message: I’m not just going to sleep with any guy that comes along. Theoretically, his willingness to wait sends a message back: I’m interested in more than sex.
  • people who are in marriages that are either bad or average might be at increased risk of divorce, because of increased access to new partners. Third, it’s unknown whether that’s good or bad for society. On one hand, it’s good if fewer people feel like they’re stuck in relationships. On the other, evidence is pretty solid that having a stable romantic partner means all kinds of health and wellness benefits.” And that’s even before one takes into account the ancillary effects of such a decrease in commitment—on children, for example, or even society more broadly.
  • As online dating becomes increasingly pervasive, the old costs of a short-term mating strategy will give way to new ones. Jacob, for instance, notices he’s seeing his friends less often. Their wives get tired of befriending his latest girlfriend only to see her go when he moves on to someone else. Also, Jacob has noticed that, over time, he feels less excitement before each new date. “Is that about getting older,” he muses, “or about dating online?” How much of the enchantment associated with romantic love has to do with scarcity (this person is exclusively for me), and how will that enchantment hold up in a marketplace of abundance (this person could be exclusively for me, but so could the other two people I’m meeting this week)?

Noam Chomsky on Where Artificial Intelligence Went Wrong - Yarden Katz - The Atlantic - 0 views

  • If you take a look at the progress of science, the sciences are kind of a continuum, but they're broken up into fields. The greatest progress is in the sciences that study the simplest systems. So take, say physics -- greatest progress there. But one of the reasons is that the physicists have an advantage that no other branch of sciences has. If something gets too complicated, they hand it to someone else.
  • If a molecule is too big, you give it to the chemists. The chemists, for them, if the molecule is too big or the system gets too big, you give it to the biologists. And if it gets too big for them, they give it to the psychologists, and finally it ends up in the hands of the literary critic, and so on.
  • neuroscience for the last couple hundred years has been on the wrong track. There's a fairly recent book by a very good cognitive neuroscientist, Randy Gallistel and King, arguing -- in my view, plausibly -- that neuroscience developed kind of enthralled to associationism and related views of the way humans and animals work. And as a result they've been looking for things that have the properties of associationist psychology.
  • ...19 more annotations...
  • in general what he argues is that if you take a look at animal cognition, human too, it's computational systems. Therefore, you want to look at the units of computation. Think about a Turing machine, say, which is the simplest form of computation, you have to find units that have properties like "read", "write" and "address." That's the minimal computational unit, so you got to look in the brain for those. You're never going to find them if you look for strengthening of synaptic connections or field properties, and so on. You've got to start by looking for what's there and what's working and you see that from Marr's highest level.
  • it's basically in the spirit of Marr's analysis. So when you're studying vision, he argues, you first ask what kind of computational tasks is the visual system carrying out. And then you look for an algorithm that might carry out those computations and finally you search for mechanisms of the kind that would make the algorithm work. Otherwise, you may never find anything.
  • "Good Old Fashioned AI," as it's labeled now, made strong use of formalisms in the tradition of Gottlob Frege and Bertrand Russell, mathematical logic for example, or derivatives of it, like nonmonotonic reasoning and so on. It's interesting from a history of science perspective that even very recently, these approaches have been almost wiped out from the mainstream and have been largely replaced -- in the field that calls itself AI now -- by probabilistic and statistical models. My question is, what do you think explains that shift and is it a step in the right direction?
  • AI and robotics got to the point where you could actually do things that were useful, so it turned to the practical applications and somewhat, maybe not abandoned, but put to the side, the more fundamental scientific questions, just caught up in the success of the technology and achieving specific goals.
  • The approximating unanalyzed data kind is sort of a new approach, not totally, there's things like it in the past. It's basically a new approach that has been accelerated by the existence of massive memories, very rapid processing, which enables you to do things like this that you couldn't have done by hand. But I think, myself, that it is leading subjects like computational cognitive science into a direction of maybe some practical applicability... [Interviewer:] ...in engineering? Chomsky: ...But away from understanding.
  • I was very skeptical about the original work. I thought it was first of all way too optimistic, it was assuming you could achieve things that required real understanding of systems that were barely understood, and you just can't get to that understanding by throwing a complicated machine at it.
  • if success is defined as getting a fair approximation to a mass of chaotic unanalyzed data, then it's way better to do it this way than to do it the way the physicists do, you know, no thought experiments about frictionless planes and so on and so forth. But you won't get the kind of understanding that the sciences have always been aimed at -- what you'll get at is an approximation to what's happening.
  • Suppose you want to predict tomorrow's weather. One way to do it is okay I'll get my statistical priors, if you like, there's a high probability that tomorrow's weather here will be the same as it was yesterday in Cleveland, so I'll stick that in, and where the sun is will have some effect, so I'll stick that in, and you get a bunch of assumptions like that, you run the experiment, you look at it over and over again, you correct it by Bayesian methods, you get better priors. You get a pretty good approximation of what tomorrow's weather is going to be. That's not what meteorologists do -- they want to understand how it's working. And these are just two different concepts of what success means, of what achievement is.
  • if you get more and more data, and better and better statistics, you can get a better and better approximation to some immense corpus of text, like everything in The Wall Street Journal archives -- but you learn nothing about the language.
  • the right approach, is to try to see if you can understand what the fundamental principles are that deal with the core properties, and recognize that in the actual usage, there's going to be a thousand other variables intervening -- kind of like what's happening outside the window, and you'll sort of tack those on later on if you want better approximations, that's a different approach.
  • take a concrete example of a new field in neuroscience, called Connectomics, where the goal is to find the wiring diagram of very complex organisms, find the connectivity of all the neurons in say human cerebral cortex, or mouse cortex. This approach was criticized by Sidney Brenner, who in many ways is [historically] one of the originators of the approach. Advocates of this field don't stop to ask if the wiring diagram is the right level of abstraction -- maybe it's not.
  • if you went to MIT in the 1960s, or now, it's completely different. No matter what engineering field you're in, you learn the same basic science and mathematics. And then maybe you learn a little bit about how to apply it. But that's a very different approach. And it resulted maybe from the fact that really for the first time in history, the basic sciences, like physics, had something really to tell engineers. And besides, technologies began to change very fast, so not very much point in learning the technologies of today if it's going to be different 10 years from now. So you have to learn the fundamental science that's going to be applicable to whatever comes along next. And the same thing pretty much happened in medicine.
  • that's the kind of transition from something like an art, that you learn how to practice -- an analog would be trying to match some data that you don't understand, in some fashion, maybe building something that will work -- to science, what happened in the modern period, roughly Galilean science.
  • it turns out that there actually are neural circuits which are reacting to particular kinds of rhythm, which happen to show up in language, like syllable length and so on. And there's some evidence that that's one of the first things that the infant brain is seeking -- rhythmic structures. And going back to Gallistel and Marr, its got some computational system inside which is saying "okay, here's what I do with these things" and say, by nine months, the typical infant has rejected -- eliminated from its repertoire -- the phonetic distinctions that aren't used in its own language.
  • people like Shimon Ullman discovered some pretty remarkable things like the rigidity principle. You're not going to find that by statistical analysis of data. But he did find it by carefully designed experiments. Then you look for the neurophysiology, and see if you can find something there that carries out these computations. I think it's the same in language, the same in studying our arithmetical capacity, planning, almost anything you look at. Just trying to deal with the unanalyzed chaotic data is unlikely to get you anywhere, just like as it wouldn't have gotten Galileo anywhere.
  • with regard to cognitive science, we're kind of pre-Galilean, just beginning to open up the subject
  • You can invent a world -- I don't think it's our world -- but you can invent a world in which nothing happens except random changes in objects and selection on the basis of external forces. I don't think that's the way our world works, I don't think it's the way any biologist thinks it is. There are all kind of ways in which natural law imposes channels within which selection can take place, and some things can happen and other things don't happen. Plenty of things that go on in the biology in organisms aren't like this. So take the first step, meiosis. Why do cells split into spheres and not cubes? It's not random mutation and natural selection; it's a law of physics. There's no reason to think that laws of physics stop there, they work all the way through. [Interviewer:] Well, they constrain the biology, sure. Chomsky: Okay, well then it's not just random mutation and selection. It's random mutation, selection, and everything that matters, like laws of physics.
  • What I think is valuable is the history of science. I think we learn a lot of things from the history of science that can be very valuable to the emerging sciences. Particularly when we realize that in say, the emerging cognitive sciences, we really are in a kind of pre-Galilean stage. We don't know what we're looking for any more than Galileo did, and there's a lot to learn from that.
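The "read", "write" and "address" primitives mentioned in the Gallistel annotation above can be made concrete with a toy sketch. Nothing here comes from the article: the machine, its rules, and the task (flipping a tape of 1s to 0s) are invented purely for illustration of what a minimal computational unit looks like.

```python
# A minimal Turing machine: the head reads a symbol at an address,
# writes a symbol back, moves to a new address, and changes state.
# Illustrative only; the transition table below is an invented example.
def run_turing_machine(tape, rules, state="start"):
    tape = list(tape)
    head = 0  # the "address"
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"  # "read"
        symbol_to_write, move, state = rules[(state, symbol)]
        if head == len(tape):
            tape.append("_")  # extend the tape on demand
        tape[head] = symbol_to_write                      # "write"
        head += 1 if move == "R" else -1                  # new address
    return "".join(tape).rstrip("_")

# Flip every 1 to 0, halting at the first blank cell.
rules = {
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("111", rules))  # -> 000
```

The point of the annotation is that these three primitives have no obvious counterpart in "strengthening of synaptic connections," which is why Gallistel argues the search has been looking in the wrong place.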

The Danger of Too Much Efficiency - NYTimes.com - 2 views

  • Each of these developments has made it easier to do one’s business without wasted time and energy — without friction. Each has made economic transactions quicker and more efficient. That’s obviously good, and that’s what Bain Capital tries to do in the companies it buys. You may employ a lazy brother-in-law who is not earning his keep. If you try to do something about it, you may encounter enormous friction — from your spouse. But if Bain buys you out, it won’t have any trouble at all getting rid of your brother-in-law and replacing him with someone more productive. This is what “creative destruction” is all about.
  • These are all situations in which a little friction to slow us down would have enabled both institutions and individuals to make better decisions. And in the case of individuals, there is the added bonus that using cash more and credit less would have made it apparent sooner just how much the “booming ’90s” had left the middle class behind. Credit hid the ever-shrinking purchasing power of the middle class from view.
  • If credit card companies weren’t allowed to charge outrageous interest, perhaps not everyone with a pulse would be offered credit cards. And if people had to pay with cash, rather than plastic, they might keep their hands in their pockets just a little bit longer.
  • ...4 more annotations...
  • All these examples tell us that increased efficiency is good, and that removing friction increases efficiency. But the financial crisis, along with the activities of the Occupy movement and the criticism being leveled at Mr. Romney, suggests that maybe there can be too much of a good thing. If loans weren’t securitized, bankers might have taken the time to assess the creditworthiness of each applicant. If homeowners had to apply for loans to improve their houses or buy new cars, instead of writing checks against home equity, they might have thought harder before making weighty financial commitments. If people actually had to go into a bank and stand in line to withdraw cash, they might spend a little less and save a little more.
  • Finding the “mean” isn’t easy, even when we try to. It is sometimes said that the only way to figure out how much is enough is by experiencing too much. But the challenge is even greater when we’re talking about companies, because companies aren’t even trying to find the “mean.” For an individual company and its shareholders, there is no such thing as too much efficiency. The price of too much efficiency is not paid by the company. It is what economists call a negative externality, paid by the people who lose their jobs and the communities that suffer from job loss. Thus, we can’t expect the free market to find the level of efficiency that keeps firms competitive, provides quality goods at affordable prices and sustains workers and their communities. If we are to find the balance, we must consider stakeholders and not just shareholders. Companies by themselves won’t do this. Sensible regulation might.
  • So the real criticism embodied by current attacks on Bain Capital is not a criticism of capitalism. It is a criticism of unbridled, single-minded capitalism. Capitalism needn’t be either of those things. It isn’t in other societies with high standards of living, and it hadn’t been historically in the United States. Perhaps we can use the current criticism of Bain Capital as an opportunity to bring a little friction back into our lives. One way to do this is to use regulation to rekindle certain social norms that serve to slow us down. For example, if people thought about their homes less as investments and more as places to live, full of the friction of kids, dogs, friends, neighbors and community organizations attached, there might be less speculation with an eye toward house-flipping. And if companies thought of themselves, at least partly, as caretakers of their communities, they might look differently at streamlining their operations.
  • We’d all like a car that gets 100 miles to the gallon. The forces of friction that slow us down are an expensive annoyance. But when we’re driving a car, we know where we’re going and we’re in control. Fast is good, though even here, a little bit of friction can forestall disaster when you encounter an icy road. Life is not as predictable as driving. We don’t always know where we’re going. We’re not always in control. Black ice is everywhere. A little something to slow us down in the uncertain world we inhabit may be a lifesaver.
  •  
    What do you think of his argument?
  •  
    How interesting! And persuasive, too. However, it also defies easy integration into the simplistic models that most of us use as foundations for our thinking about society, and particularly, in our normative thinking ("What *should* we do?"). So I expect that 3% of readers will share my initial intellectual appreciation of the argument, but 97% of those who do will quickly forget it.

Meeting 'the Other' Face to Face - The New York Times - 0 views

  • Sitting in a conference room at a hotel near the Massachusetts Institute of Technology here, I slip on large headphones and an Oculus Rift virtual reality headset and wriggle into the straps of a backpack, weighed down with a computer and a battery.
  • when I stand, I quickly find myself in a featureless all-white room, a kind of Platonic vestibule. On the walls at either end are striking poster-size black-and-white portraits taken by the noted Belgian-Tunisian photographer Karim Ben Khelifa, one showing a young Israeli soldier and another a Palestinian fighter about the same age, whose face is almost completely hidden by a black hood.
  • Then the portraits disappear, replaced by doors, which open. In walk the two combatants — Abu Khaled, a fighter for the Popular Front for the Liberation of Palestine, and Gilad Peled, an Israeli soldier — seeming, except for a little pixelation and rigid body movement, like flesh-and-blood people who are actually in the room with me.
  • ...11 more annotations...
  • What he saw there was a culture of warfare that often perpetuated itself through misunderstanding and misinformation, with no mechanism for those of opposing sects or political forces to gain a sense of the enemy as a fellow human being.
  • “I began to think, ‘I’m meeting the same people over and over again,’” he said. “I’m seeing people I knew as kids, and now they’re grown-up fighters, in power, fighting the same fight. And you start to think about your work in terms of: ‘Am I helping to change anything? Am I having any impact?’ ”
  • “I thought of myself as a war illustrator. I started calling myself that.”
  • as a visiting artist at the university’s Center for Art, Science and Technology, he transformed what he initially conceived of as an unconventional photo and testimonial project involving fighters into a far more unconventional way of hearing and seeing his subjects, hoping to be able to engender a form of empathy beyond the reach of traditional documentary film
  • Then he and a small crew captured three-dimensional scans of the men and photographed them from multiple angles
  • He interviewed Mr. Khaled in Gaza and Mr. Peled in Tel Aviv, asking them the same six questions — basic ones like “Who’s your enemy and why?”; “What is peace for you?”; “Have you ever killed one of your enemies?”; “Where do you see yourself in 20 years?”
  • he began to build avatars of his interviewees and ways for them to move and respond inside a virtual world so realistic it makes even a 3-D movie seem like an artifact from the distant past. Mr. Harrell describes it as “long-form journalism in a totally new form.”
  • “You have something here you don’t have in any other form of journalism: body language.”
  • indeed, inside the world they have made, the power comes from the feeling of listening to the interviewees speak (you hear Mr. Ben Khelifa’s disembodied voice asking the questions, and the men’s voices answer, overlaid by the voice of an interpreter) as your body viscerally senses a person standing a few feet away from you, his eyes following yours as he talks, his chest rising and falling as he breathes.
  • Sofia Ayala, an M.I.T. sophomore, tested the project after I did and emerged — as I did — with a mesmerized flush on her face, a feeling of meeting someone not really there. “It makes it feel so much more personal than just reading about these things online,” she said. “When someone’s right there talking to you, you want to listen.”
  • “In many places I’ve been, you’re given your enemy when you’re born,” he said. “You grow up with this ‘other’ always out there. The best we can hope is that the ‘other’ will now be able to come into the same room with you for a while, where you can listen to him, and see him face to face.”

Why I'm voting for Trump - CNNPolitics.com - 0 views

  • Trump is thriving, tapping into the fears and anxieties that have erupted into the open in an extraordinary presidential campaign.
  • Trump's nativist rhetoric and hardline immigration stance are a relief for those who see a segment of the population "getting away" with breaking the law
  • he has such deep loyalty among his supporters that he could "stand in the middle of 5th Avenue and shoot somebody and I wouldn't lose voters."
  • ...14 more annotations...
  • anti-establishment anger, and the racial and economic fears beneath it
  • The belief that Americans are unsafe, and he will protect them; an appreciation for the simple good vs. evil worldview he presents; an admiration of his celebrity status and business background
  • white frustration around race that Trump is tapping into
  • A majority of whites have a fundamentally different view of whether the federal government should ensure income equality between whites and minorities: 57% of whites said this was not the government's burden, but a majority of African-Americans (67%) and Hispanics (63%) said it was
  • he was tired of the so-called "new Americans" flooding the country.
  • You know this is bulls---- about black lives matter -- doesn't all lives matter?"
  • white Americans feel they are subjected to racial discrimination
  • Almost half of whites -- 47% -- said in a November CNN/Kaiser Family Foundation survey that there is discrimination against whites, far more than the share of blacks and Hispanics who said the same
  • "I don't believe all Muslims are bad. But anybody can turn bad, and you've got to be able to locate them and know where they're at,"
  • "Islam is not a religion. It's a violent blood cult. OK?"
  • "All they know is violence, that's all they know."
  • "We can't look at a Muslim and tell if they're a terrorist or friendly."
  • And why have recent protests become increasingly ugly and even violent?
  • "Get 'em the hell out of here," Trump said, waving his hand dismissively.

Why time seems to speed up as we get older - Vox - 0 views

  • As part of a lifelong experiment on circadian rhythms, Sothern, now 69, is trying to confirm or reject a widely held belief: Many people feel that time flies by more quickly as they age.
  • So far, Sothern's results are inconclusive
  • "I'm tending now to overestimate the minute more than I used to," he tells me. But then again, he had detected a similar pattern — more overestimates — in the 1990s, only to have his estimates fall in the 2000s. "Time estimation isn't a perfect science," he says.
  • ...17 more annotations...
  • There's very little scientific evidence to suggest our perception of time changes as we age. And yet, we consistently report that the past felt longer — that time is flying by faster as we age. What's going on?
  • Scientists can look at time estimation, or our ability to estimate how long a minute passes, compared with a clock. (This is what Sothern is doing.) They can also look at time awareness, or the broad feeling that time is moving quickly or slowly. Finally there's time perspective, the sense of a past, present, and future as constructed by our memories.
  • What researchers have found out is that while time estimation and time awareness don't change much as we age, time perspective does. In other words: Our memories create the illusion time is accelerating.
  • There weren't many differences between the old and the young. "[C]hronological age showed no systematic influence on the perception of these brief intervals of time," the authors wrote. (That said, the researchers did find that males overestimate time while females underestimate it, perhaps due to having slightly different circadian clocks and therefore slightly different metabolic rates.)
  • Here, too, age seemed not to matter. Older people didn't seem to be aware of time passing any faster than younger people. The only question that yielded a statistically significant difference was, "How fast did the last decade pass?" Even there, the reported differences were tiny, and the effect appeared to plateau around age 50.
  • psychologists William Friedman and Steve Janssen found scant evidence that the subjective experience of time speeds up with age. They write in their 2009 paper, "We can conclude that when adults report on their general impressions of the speed of time, age differences are very small."
  • One possibility is that participants were simply biased by the (incorrect) conventional wisdom — they reported their later years as flying by more quickly because that's what everyday lore says should happen.
  • When people reflect back on their own life, they feel like their early years went by very slowly and their later years go by more quickly. This could be the source of the belief that time goes more quickly as they age.
  •  "Most people feel that time is currently passing faster for them than it did in the past," Janssen writes me in an email. "They have forgotten how they experienced the passage of time when they were younger."
  • We use significant events as signposts to gauge the passage of time. The fewer events, the faster time seems to go by.
  • Childhood is full of big, memorable moments like learning to ride a bike or making first friends. By contrast, adult life becomes ordinary and mechanized, and ambles along.
  • Each passing year converts some of this experience into automatic routine which we hardly notice at all, the days and weeks smooth themselves out in recollection, and the years grow hollow and collapse.
  • Each new minute represents a smaller fraction of our lives. One day as a 10 year old represents about .027 percent of the kid's life. A day for a 60 year old? .0045 percent. The kid's life is just... bigger.
  • Also, our ability to recall events declines with age. If we can't remember a time, it didn't happen.
  • "[F]inding that there is insufficient time to get things done may be reinterpreted as the feeling that time is passing quickly," they write. Deadlines always come sooner than we'd like.
  • Psychologists have long understood the phenomenon called "forward telescoping" — i.e., our tendency to underestimate how long ago very memorable events occurred. "Because we know that memories fade over time, we use the clarity of a memory as a guide to its recency," science writer Claudia Hammond writes in her book Time Warped. "So if a memory seems unclear we assumed it happened longer ago." But very clear memories are assumed to be more recent.
  • If our memories can trick us into thinking time is moving quickly, then maybe there are ways to trick our brains into thinking that time is slowing down — such as committing to breaking routines and learning new things. You're more likely to remember learning how to skydive than watching another hour of mindless television.
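The "ratio theory" arithmetic in the annotations above is easy to check. This toy calculation is my own, not from the Vox article, but it reproduces the quoted .027 and .0045 percent figures:

```python
# One day as a percentage of the life lived so far, at a given age.
def day_as_percent_of_life(age_years):
    days_lived = age_years * 365.25
    return 100 / days_lived

for age in (10, 60):
    print(f"age {age}: one day is {day_as_percent_of_life(age):.4f}% of life so far")
```

At age 10 a day is roughly 0.027 percent of one's life; at 60, roughly 0.0046 percent, a sixfold shrinkage that matches the article's intuition that "the kid's life is just... bigger."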

The Failure of Rational Choice Philosophy - NYTimes.com - 1 views

  • According to Hegel, history is idea-driven.
  • Ideas for him are public, rather than in our heads, and serve to coordinate behavior. They are, in short, pragmatically meaningful words.  To say that history is “idea driven” is to say that, like all cooperation, nation building requires a common basic vocabulary.
  • One prominent component of America’s basic vocabulary is “individualism.”
  • ...12 more annotations...
  • individualism, the desire to control one’s own life, has many variants. Tocqueville viewed it as selfishness and suspected it, while Emerson and Whitman viewed it as the moment-by-moment expression of one’s unique self and loved it.
  • individualism as the making of choices so as to maximize one’s preferences. This differed from “selfish individualism” in that the preferences were not specified: they could be altruistic as well as selfish. It differed from “expressive individualism” in having general algorithms by which choices were made. These made it rational.
  • it was born in 1951 as “rational choice theory.” Rational choice theory’s mathematical account of individual choice, originally formulated in terms of voting behavior, made it a point-for-point antidote to the collectivist dialectics of Marxism
  • Functionaries at RAND quickly expanded the theory from a tool of social analysis into a set of universal doctrines that we may call “rational choice philosophy.” Governmental seminars and fellowships spread it to universities across the country, aided by the fact that any alternative to it would by definition be collectivist.
  • rational choice philosophy moved smoothly on the backs of their pupils into the “real world” of business and government
  • Today, governments and businesses across the globe simply assume that social reality  is merely a set of individuals freely making rational choices.
  • At home, anti-regulation policies are crafted to appeal to the view that government must in no way interfere with Americans’ freedom of choice.
  • But the real significance of rational choice philosophy lay in ethics. Rational choice theory, being a branch of economics, does not question people’s preferences; it simply studies how they seek to maximize them. Rational choice philosophy seems to maintain this ethical neutrality (see Hans Reichenbach’s 1951 “The Rise of Scientific Philosophy,” an unwitting masterpiece of the genre); but it does not.
  • Whatever my preferences are, I have a better chance of realizing them if I possess wealth and power. Rational choice philosophy thus promulgates a clear and compelling moral imperative: increase your wealth and power!
  • Today, institutions which help individuals do that (corporations, lobbyists) are flourishing; the others (public hospitals, schools) are basically left to rot. Business and law schools prosper; philosophy departments are threatened with closure.
  • Hegel, for one, had denied all three of its central claims in his “Encyclopedia of the Philosophical Sciences” over a century before. In that work, as elsewhere in his writings, nature is not neatly causal, but shot through with randomness. Because of this chaos, we cannot know the significance of what we have done until our community tells us; and ethical life correspondingly consists, not in pursuing wealth and power, but in integrating ourselves into the right kinds of community.
  • By 1953, W. V. O. Quine was exposing the flaws in rational choice epistemology. John Rawls, somewhat later, took on its sham ethical neutrality, arguing that rationality in choice includes moral constraints. The neat causality of rational choice ontology, always at odds with quantum physics, was further jumbled by the environmental crisis, exposed by Rachel Carson’s 1962 book “Silent Spring,” which revealed that the causal effects of human actions were much more complex, and so less predictable, than previously thought.

New Prospects for Growing Human Replacement Organs in Animals - The New York Times - 0 views

  • For the first time, biologists have succeeded in growing human stem cells in pig embryos, shifting from science fiction to the realm of the possible the idea of developing human organs in animals for later transplant.
  • Since the organ would be made of a patient’s own cells, there would be little risk of immune rejection.
  • They would be generated by implanting human stem cells into an early pig embryo, resulting in an animal composed of mixed pig and human cells.
  • The two reports together establish the feasibility of trying to grow replacement human organs in animals, though such a goal is still far off.
  • Creating chimeras, especially those with human cells, may prove controversial, given the possibility that test animals could be humanized in undesirable ways. One worry is that human cells could be incorporated into a pig’s brain, endowing it with human qualities. Almost no one wants a talking pig.
  • The ban is still in place, and it’s unclear whether the Trump administration would continue to consider lifting the moratorium or whether new objections would be raised to using public funds for this line of research.
  • But no one knows exactly what sequence of chemicals is required for the generation of each different tissue or organ. This may be why glassware experiments with stem cells have not yet lived up to their full promise.
  • Concern about human cells’ incorporation into a lower animal’s brain is not without basis. Dr. Steven Goldman of the University of Rochester Medical Center found in 2013 that mice injected with a special type of human brain cell had enhanced learning abilities.
  •  
    Ethics in biology is always a controversial issue in research. In biology we want to get as close to the truth as we possibly can, but that sometimes means doing research on things that are considered inhumane. This article discusses a new possibility: growing human stem cells in animals. That raises the potential problem of animals gaining intelligence. If animals start to take on human properties, how would we treat them? Would they be a threat to our identity? The experiment described in the article also shows that the scientific method in biology deals in probability and populations: researchers are always gathering data, and they are always ready for exceptions. --Sissi (1/28/2017)

Does a Protest's Size Matter? - The New York Times - 1 views

  • The Women’s March on Saturday, which took place in cities and towns all across the United States (and around the world), may well have been the largest protest in American history. There were an estimated 3.5 million participants.
  • After studying protests over the last two decades, I have to deliver some bad news: In the digital age, the size of a protest is no longer a reliable indicator of a movement’s strength.
  • A protest does not have power just because many people get together in one place. Rather, a protest has power insofar as it signals the underlying capacity of the forces it represents.
  • Protesters are saying, in effect, “If we can pull this off, imagine what else we can do.”
  • The march drew a quarter of a million people, but it represented much more effort, commitment and preparation than would a protest of similar size today.
  • This is one reason that recent large protests have had less effect on policy than many were led to expect.
  • The protesters failed to transform into an electoral force capable of defeating him in the 2004 election.
  • Two enormous protests, two disappointing results. Similar sequences of events have played out in other parts of the world.
  • A large protest today is less like the March on Washington in 1963 and more like Rosa Parks’s refusal to move to the back of the bus. What used to be an endpoint is now an initial spark.
  • But the Tea Party protesters then got to work on a ferociously focused agenda: identifying and supporting primary candidates to challenge Republicans who did not agree with their demands, keeping close tabs on legislation and pressuring politicians who deviated from a Tea Party platform.
  • But there is no magic power to marching in the streets that, on its own, leads to any other kind of result.
  •  
    This article explains how protests work. I had always thought that protests were all about the number of people we could gather: the larger the crowd, the more powerful the protest. However, I had never looked deeply into the mechanism behind protests. I really like the analogy made in the article. The main purpose of a protest should be to show the potential strength the public holds over an issue. If we do nothing after the gathering, the protest won't be powerful enough to influence government policy, because the government will know that we are not actually that firm in our position. The analogy I came up with is that attendance can't reflect how much we learn in school: attending school doesn't ensure that we take away knowledge from it, and merely attending a protest doesn't mean we can put pressure on the government. --Sissi (1/29/2017)

Social Media and the Devolution of Friendship: Full Essay (Pts I & II) » Cybo... - 1 views

  • social networking sites create pressure to put time and effort into tending weak ties, and how it can be impossible to keep up with them all. Personally, I also find it difficult to keep up with my strong ties. I’m a great “pick up where we left off” friend, as are most of the people closest to me (makes sense, right?). I’m decidedly sub-awesome, however, at being in constant contact with more than a few people at a time.
  • the devolution of friendship. As I explain over the course of this essay, I link the devolution of friendship to—but do not “blame” it on—the affordances of various social networking platforms, especially (but not exclusively) so-called “frictionless sharing” features.
  • I’m using the word here in the same way that people use it to talk about the devolution of health care. One example of devolution of health care is some outpatient surgeries: patients are allowed to go home after their operations, but they still require a good deal of post-operative care such as changing bandages, irrigating wounds, administering medications, etc. Whereas before these patients would stay in the hospital and nurses would perform the care-labor necessary for their recoveries, patients must now find their own caregivers (usually family members or friends; sometimes themselves) to perform free care-labor. In this context, devolution marks the shift of labor and responsibility away from the medical establishment and onto the patient; within the patient-medical establishment collaboration, the patient must now provide a greater portion of the necessary work. Similarly, in some ways, we now expect our friends to do a greater portion of the work of being friends with us.
  • Through social media, “sharing with friends” is rationalized to the point of relentless efficiency. The current apex of such rationalization is frictionless sharing: we no longer need to perform the labor of telling our individual friends about what we read online, or of copy-pasting links and emailing them to “the list,” or of clicking a button for one-step posting of links on our Facebook walls. With frictionless sharing, all we have to do is look, or listen; what we’ve read or watched or listened to is then “shared” or “scrobbled” to our Facebook, Twitter, Tumblr, or whatever other online profiles. Whether we share content actively or passively, however, we feel as though we’ve done our half of the friendship-labor by ‘pushing’ the information to our walls, streams, and tumblelogs. It’s then up to our friends to perform their halves of the friendship-labor by ‘pulling’ the information we share from those platforms.
  • We’re busy people; we like the idea of making one announcement on Facebook and being done with it, rather than having to repeat the same story over and over again to different friends individually. We also like not always having to think about which friends might like which stories or songs; we like the idea of sharing with all of our friends at once, and then letting them sort out amongst themselves who is and isn’t interested. Though social media can create burdensome expectations to keep up with strong ties, weak ties, and everyone in between, social media platforms can also be very efficient. Using the same moment of friendship-labor to tend multiple friendships at once kills more birds with fewer stones.
  • sometimes we like the devolution of friendship. When we have to ‘pull’ friendship-content instead of receiving it in a ‘push’, we can pick and choose which content items to pull. We can ignore the baby pictures, or the pet pictures, or the sushi pictures—whatever it is our friends post that we only pretend to care about
  • I’ve been thinking since, however, on what it means to view our friends as “generalized others.” I may now feel less like a “creepy stalker” when I click on a song in someone’s Spotify feed, but I don’t exactly feel ‘shared with’ either. Far as I know, I’ve never been SpotiVaguebooked (or SubSpotified?); I have no reason to think anyone is speaking to me personally as they listen to music, or as they choose not to disable scrobbling (if they make that choice consciously at all). I may have been granted the opportunity to view something, but it doesn’t follow that what I’m viewing has anything to do with me unless I choose to make it about me. Devolved friendship means it’s not up to us to interact with our friends personally; instead it’s now up to our friends to make our generalized broadcasts personal.
  • While I won’t go so far as to say they’re definitely ‘problems,’ there are two major things about devolved friendship that I think are worth noting. The first is the non-uniform rationalization of friendship-labor, and the second is the depersonalization of friendship-labor.
  • In short, “sharing” has become a lot easier and a lot more efficient, but “being shared with” has become much more time-consuming, demanding, and inefficient (especially if we don’t ignore most of our friends most of the time). Given this, expecting our friends to keep up with our social media content isn’t expecting them to meet us halfway; it’s asking them to take on the lion’s share of staying in touch with us. Our jobs (in this role) have gotten easier; our friends’ jobs have gotten harder.
  • The second thing worth noting is that devolved friendship is also depersonalized friendship.
  • Personal interaction doesn’t just happen on Spotify, and since I was hoping Spotify would be the New Porch, I initially found Spotify to be somewhat lonely-making. It’s the mutual awareness of presence that gives companionate silence its warmth, whether in person or across distance. The silence within Spotify’s many sounds, on the other hand, felt more like being on the outside looking in. This isn’t to say that Spotify can’t be social in a more personal way; once I started sending tracks to my friends, a few of them started sending tracks in return. But it took a lot more work to get to that point, which gets back to the devolution of friendship (as I explain below).
  • Within devolved friendship interactions, it takes less effort to be polite while secretly waiting for someone to please just stop talking.
  • When we consider the lopsided rationalization of ‘sharing’ and ‘shared with,’ as well as the depersonalization of frictionless sharing and generalized broadcasting, what becomes clear is this: the social media deck is stacked in such a way as to make being ‘a self’ easier and more rewarding than being ‘a friend.’
  • It’s easy to share, to broadcast, to put our selves and our tastes and our identity performances out into the world for others to consume; what feedback and friendship we get in return comes in response to comparatively little effort and investment from us. It takes a lot more work, however, to do the consumption, to sift through everything all (or even just some) of our friends produce, to do the work of connecting to our friends’ generalized broadcasts so that we can convert their depersonalized shares into meaningful friendship-labor.
  • We may be prosumers of social media, but the reward structures of social media sites encourage us to place greater emphasis on our roles as share-producers—even though many of us probably spend more time consuming shared content than producing it. There’s a reason for this, of course; the content we produce (for free) is what fuels every last ‘Web 2.0’ machine, and its attendant self-centered sociality is the linchpin of the peculiarly Silicon Valley concept of “Social” (something Nathan Jurgenson and I discuss together in greater detail here). It’s not super-rewarding to be one of ten people who “like” your friend’s shared link, but it can feel rewarding to get 10 “likes” on something you’ve shared—even if you have hundreds or thousands of ‘friends.’ Sharing is easy; dealing with all that shared content is hard.
  • I wonder sometimes if the shifts in expectation that accompany devolved friendship don’t migrate across platforms and contexts in ways we don’t always see or acknowledge. Social media affects how we see the world—and how we feel about being seen in the world—even when we’re not engaged directly with social media websites. It’s not a stretch, then, to imagine that the affordances of social media platforms might also affect how we see friendship and our obligations as friends most generally.

A Crush on God | Commonweal magazine - 0 views

  • Ignatius taught the Jesuits to end each day doing something called the Examen. You start by acknowledging that God is there with you; then you give thanks for the good parts of your day (mine usually include food); and finally, you run through the events of the day from morning to the moment you sat down to pray, stopping to consider when you felt consolation, the closeness of God, or desolation, when you ignored God or when you felt like God bailed on you. Then you ask for forgiveness for anything shitty you did, and for guidance tomorrow. I realize I’ve spent most of my life saying “thanks” to people in a perfunctory, whatever kind of way. Now when I say it I really mean it, even if it’s to the guy who makes those lattes I love getting in the morning, because I stopped and appreciated his latte-making skills the night before. If you are lucky and prone to belief, the Examen will also help you start really feeling God in your life.
  • My church hosts a monthly dinner for the homeless. Serious work is involved; volunteers pull multiple shifts shopping, prepping, cooking, serving food, and cleaning. I show up for the first time and am shuttled into the kitchen by a harried young woman with a pen stuck into her ponytail, who asks me if I can lift heavy weights before putting me in front of two bins of potato salad and handing me an ice cream scoop. For three hours, I scoop potato salad onto plates, heft vats of potato salad, and scrape leftover potato salad into the compost cans. I never want to eat potato salad again, but I learn something about the homeless people I’ve been avoiding for years: some are mentally a mess, many—judging from the smell—are drunk off their asses, but on the whole, they are polite, intelligent, and, more than anything else, grateful. As I walk back to my car, I’m stopped several times by many of them who want to thank me, saying how good the food was, how much they enjoyed it. “I didn’t do anything,” I say in return. “You were there,” one of them replies. It’s enough to make me go back the next month, and the month after that. And in between, when I see people I feed on the street, instead of focusing my eyes in the sidewalk and hoping they go away, we have conversations. It’s those conversations that move me from intellectual distance toward a greater sense of gratitude for the work of God.

Living in the Material World - NYTimes.com - 0 views

  • on a visit to the Academy of Sciences in Almaty some years ago I was presented with a souvenir meant to assure me that Central Asia was indeed still producing philosophy worthy of note. It was a collectively authored book entitled “The Development of Materialist Dialectics in Kazakhstan,” and I still display it proudly on my shelf. Its rough binding and paper bespeak economic hardship. It is packed with the traces of ideas, yet everything about the book announces its materiality. I had arrived in the Kazakh capital in 1994, just in time to encounter the last of a dying breed: the philosopher as party functionary (they are all by now retired, dead or defenestrated, or have simply given up on what they learned in school). The book, written by committee, was a collection of official talking points, and what passed for conversation there was something much closer to recitation.
  • The philosophical meaning of materialism may in the final analysis be traced back to a religious view of the world. On this view, to focus on the material side of existence is to turn away from the eternal and divine. Here, the category of the material is assimilated to that of sin or evil.
  • Yet in fact this feature of Marxist philosophical classification is one that, with some variations, continues to be shared by all philosophers, even in the West, even today
  • materialism is not the greedy desire for material goods, but rather the belief that the fundamental reality of the world is material;
  • idealism is not the aspiration toward lofty and laudable goals, but rather the belief that the fundamental reality of the world is mental or idea-like. English-speaking philosophers today tend to speak of “physicalism” or “naturalism” rather than materialism (perhaps to avoid confusion with the Wall Street sense of the term). At the same time, Anglo-American historians of philosophy continue to find the distinction between materialism and idealism a useful one in our attempts at categorizing past schools of thought. Democritus and La Mettrie were materialists; Hobbes was pretty close. Berkeley and Kant were idealists; Leibniz may have been.
  • And it was these paradoxes that led the Irish philosopher to conclude that talk of matter was but a case of multiplying entities beyond necessity. For Berkeley, all we can know are ideas, and for this reason it made sense to suppose that the world itself consists in ideas.
  • Soviet and Western Marxists alike, by stark contrast, and before them the French “vulgar” (i.e., non-dialectical) materialists of the 18th century, saw and see the material world as the base and cause of all mental activity, as both bringing ideas into existence, and also determining the form and character of a society’s ideas in accordance with the state of its technology, its methods of resource extraction and its organization of labor. So here to focus on the material is not to become distracted from the true source of being, but rather to zero right in on it.
  • one great problem with the concept of materialism is that it says very little in itself. What is required in addition is an elaboration of what a given thinker takes matter, or ideas, to be. It may not be just the Marxist aftertaste, but also the fact that the old common-sense idea about matter as brute, given stuff has turned out to have so little to do with the way the physical world actually is, that has led Anglo-American philosophers to prefer to associate themselves with the “physical” or the “natural” rather than with the material.  Reality, they want to say, is just what is natural, while everything else is in turn “supernatural” (this distinction has its clarity going for it, but it also seems uncomfortably close to tautology). Not every philosopher has a solid grasp of subatomic physics, but most know enough to grasp that, even if reality is eventually exhaustively accounted for through an enumeration of the kinds of particles and a few basic forces, this reality will still look nothing like what your average person-in-the-street takes reality to be.
  • The 18th-century idealist philosopher George Berkeley strongly believed that matter was only a fiction contrived by philosophers in the first place, for which the real people had no need. For Berkeley, there was never anything common-sensical about matter. We did not need to arrive at the era of atom-splitting and wave-particle duality, then, in order for the paradoxes inherent in matter to make themselves known (is it infinitely divisible or isn’t it?
  • Central to this performance was the concept of  “materialism.” The entire history of philosophy, in fact, was portrayed in Soviet historiography as a series of matches between the materialist home-team and its “idealist” opponents, beginning roughly with Democritus (good) and Plato (bad), and culminating in the opposition between official party philosophy and logical positivism, the latter of which was portrayed as a shrouded variety of idealism. Thus from the “Short Philosophical Dictionary,” published in Moscow in 1951, we learn that the school of logical empiricism represented by Rudolf Carnap, Otto Neurath and others, “is a form of subjective idealism, characteristic of degenerating bourgeois philosophy in the epoch of the decline of capitalism.” Now the Soviet usage of this pair of terms appears to fly in the face of our ordinary, non-philosophical understanding of them (that, for example,  Wall Street values are “materialist,” while the Occupy movement is “idealist”). One might have thought that the communists should be flinging the “materialist” label at their capitalist enemies, rather than claiming it for themselves. One might also have thought that the Bolshevik Revolution and the subsequent failed project of building a workers’ utopia was nothing if not idealistic.
  • Consider money. Though it might sometimes be represented by bank notes or coins, money is an immaterial thing par excellence, and to seek to acquire it is to move on the plane of ideas. Of course, money can also be converted into material things, yet it seems simplistic to suppose that we want money only in order to convert it into the material things we really want, since even these material things aren’t just material either: they are symbolically dense artifacts, and they convey to others certain ideas about their owners. This, principally, is why their owners want them, which is to say that materialists (in the everyday sense) are trading in ideas just as much as anyone else.
  • In the end no one really cares about stuff itself. Material acquisitions — even, or perhaps especially, material acquisitions of things like Rolls Royces and Rolexes — are maneuvers within a universe of materially instantiated ideas. This is human reality, and it is within this reality that mystics, scientists, and philosophers alike are constrained to pursue their various ends, no matter what they might take the ultimate nature of the external world to be.
  •  
    A very interesting article on the contrast between materialism and idealism.

The Philosopher Whose Fingerprints Are All Over the FTC's New Approach to Privacy - Ale... - 0 views

  • The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles change. The key academic term is "context-relative informational norms." Bust a norm and people get upset.
  • Nissenbaum gets us past thinking about privacy as a binary: either something is private or something is public. Nissenbaum puts the context -- or social situation -- back into the equation. What you tell your bank, you might not tell your doctor.
  • Furthermore, these differences in information sharing are not bad or good; they are just the norms.
  • any privacy regulation that's going to make it through Congress has to provide clear ways for companies to continue profiting from data tracking. The key is coming up with an ethical framework in which they can do so, and Nissenbaum may have done just that. 
  • The traditional model of how this works says that your information is something like a currency and when you visit a website that collects data on you for one reason or another, you enter into a contract with that site. As long as the site gives you "notice" that data collection occurs -- usually via a privacy policy located through a link at the bottom of the page -- and you give "consent" by continuing to use the site, then no harm has been done. No matter how much data a site collects, if all they do is use it to show you advertising they hope is more relevant to you, then they've done nothing wrong.
  • let companies do standard data collection but require them to tell people when they are doing things with data that are inconsistent with the "context of the interaction" between a company and a person.
  • How can anyone make a reasonable determination of how their information might be used when there are more than 50 or 100 or 200 tools in play on a single website in a single month?
  • Nissenbaum doesn't think it's possible to explain the current online advertising ecosystem in a useful way without resorting to a lot of detail. She calls this the "transparency paradox," and considers it insoluble.
  • she wants to import the norms from the offline world into the online world. When you go to a bank, she says, you have expectations of what might happen to your communications with that bank. That should be true whether you're online, on the phone, or at the teller.  Companies can use your data to do bank stuff, but they can't sell your data to car dealers looking for people with a lot of cash on hand.
  • Nevermind that if you actually read all the privacy policies you encounter in a year, it would take 76 work days. And that calculation doesn't even account for all the 3rd parties that drain data from your visits to other websites. Even more to the point: there is no obvious way to discriminate between two separate webpages on the basis of their data collection policies. While tools have emerged to tell you how many data trackers are being deployed at any site at a given moment, the dynamic nature of Internet advertising means that it is nearly impossible to know the story through time
  • here's the big downside: it rests on the "norms" that people expect. While that may be socially optimal, it's actually quite difficult to figure out what the norms for a given situation might be. After all, there is someone else who depends on norms for his thinking about privacy.
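Nissenbaum's scheme — senders and receivers exchanging typed information under transmission principles, with a violation occurring whenever a flow "busts a norm" — can be sketched as a toy norm check. The `Flow` type, the context names, and the bank norms below are illustrative assumptions for this sketch, not Nissenbaum's actual formalism or the FTC's framework:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One information flow: who sends what to whom, under which transmission principle."""
    sender: str
    receiver: str
    info_type: str
    principle: str

# Hypothetical norms for a banking context: what customers expect to happen
# with their communications, whether online, on the phone, or at the teller.
BANK_NORMS = {
    Flow("customer", "bank", "account balance", "confidentiality"),
    Flow("bank", "customer", "monthly statement", "confidentiality"),
}

def busts_a_norm(flow: Flow, context_norms: set) -> bool:
    # A privacy violation is a flow that matches no accepted norm for the
    # context, regardless of how much data is involved.
    return flow not in context_norms

# Using bank data to "do bank stuff" fits the context...
ok = Flow("customer", "bank", "account balance", "confidentiality")
# ...but selling cash-on-hand data to car dealers changes receiver and principle.
leak = Flow("bank", "car dealer", "account balance", "sale")

print(busts_a_norm(ok, BANK_NORMS))    # False
print(busts_a_norm(leak, BANK_NORMS))  # True
```

Even in this toy, the violation is detected by a change in receiver and transmission principle, not by any threshold on the amount of data collected — which is the point of the contextual approach.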

Does Facebook Turn People Into Narcissists? - NYTimes.com - 0 views

  • Those who frequently updated their Facebook status, tagged themselves in photos and had large numbers of virtual friends, were more likely to exhibit narcissistic traits, the study found. Another study found that people with high levels of narcissism were more likely to spend more than an hour a day on Facebook, and they were also more likely to post digitally enhanced personal photos. But what the research doesn’t answer is whether Facebook attracts narcissists or turns us into them.
  • researchers found, to their surprise, that frequency of Facebook use, whether it was for personal status updates or to connect with friends, was not associated with narcissism. Narcissism per se was associated with only one type of Facebook user — those who amassed unrealistically large numbers of Facebook friends.
  • frequent Facebook users were more likely to score high on “openness” and were less concerned about privacy. So what seems like self-promoting behavior may just reflect a generation growing up in the digital age, where information — including details about personal lives — flows freely and connects us.
  • The social medium of choice for the self-absorbed appears to be Twitter. The researchers found an association between tweeting about oneself and high narcissism scores.

Chris Hayes Has Arrived With 'Up' - NYTimes.com - 0 views

  • In less than a year on television (and with a chirpy voice, a weakness for gesticulation and a tendency to drop honors-thesis words like “signifier” into casual conversation), Mr. Hayes has established himself as Generation Y’s wonk prince of the morning political talk-show circuit.
  • “He is never doctrinaire,” Mr. Leo said in an interview. Both punk fans and “Up” fans are “suspicious of any authority,” he said, and appreciate that Mr. Hayes “is always willing to challenge his own assumptions, and the received wisdom on both sides of the aisle.”
  • Social media, in fact, have played an unusually important role in driving traffic to the program, an MSNBC spokeswoman said. About 45 percent of the visitors to the program’s Web site, which contains complete episodes, linked through sites like Facebook and Twitter. In April, those users spent an average of 51 minutes on the site each visit.
  • “Up” comes off as a rebuke to traditional cable shout-fests like CNN’s late “Crossfire.” Thanks to its early weekend time slot, the program has the freedom to unwind over two hours each Saturday and Sunday. Guests are encouraged to go deep into the issues of the week, and not try to score cheap-shot points to win the debate.
  • “The first and foremost important rule of the show: we’re not on television — no talking points, no sound bites,” he said, his hair still a bed-head tangle and his suit collar askew. “We have a lot of time for actual conversation. So actually listen, actually respond.”
  • An hour later, as the cameras rolled, Mr. Hayes and his guests waded thigh-deep into an analysis of private equity and whether it is bad for the economy. At a table of wonks, Mr. Hayes, who studied the philosophy of mathematics at Brown, came off as the wonkiest as he deconstructed the budgetary implications of tax arbitrage. Opinions were varied and passionate, but there was no sniping, no partisan grandstanding.
  • “I like the fact that it’s dialogic, small-d ‘democratic,’ ” Mr. Hayes said of his show. “We’re all sitting at t
  • Since Dec. 26, it has been No. 1 on average in its Sunday time slot on cable news channels among viewers ages 18 to 34, according to Nielsen figures provided by the network.
  • Ms. Maddow said on her program that “Up” was “the best news show on TV, including this one.” “Chris is the antidote to the anti-intellectual posing that has characterized the last decade in cable news,”
  • “No one else in cable is even trying long-form, off-the-news-cycle dives like him — let alone succeeding at them as he is. He’s giving the network Sunday shows a run for their money.”
  • As a student at Hunter College High School in Manhattan, he aspired to write. “My dream when I was 14,” he said, “was someday I could have a David Levine caricature of me in The New York Review of Books.”

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge from a scientific perspective of how human memory works can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later, and then a few years later, and the memory changed.
  • Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later?
  • Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory?
  • Working memory is really a temporary memory buffer where you hold onto information, manipulate information, and use it. It’s partly a gateway to long-term memory, and also a buffer that you use when you’re retrieving information from long-term memory; that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and the great confidence with which he reported them, he was perceived to be something analogous to a human tape recorder. Yet there was interesting work done by the psychologist Ulric Neisser, who went back and analyzed what Dean said at the hearings as compared with the available information from the White House taping system, and found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist, the meaning and the overall significance right, but the exact details in his memory were often quite different from what was actually said.
  • That seems to get into the area of false memories and how they present problems in the legal system.
  • We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases rested on faulty eyewitness memory -- one of the more recent estimates is that, among the first 250 DNA exoneration cases as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory, and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and the decisions they had made about Perot in the interim affected how they reconstructed their earlier memories.
  • Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder.
  • That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment?
  • I think that generally holds true. Certainly, again, more important memories tend to be more significant, carry more emotional arousal, and may elicit “deeper processing,” as we call it in cognitive psychology.