
caelengrubb

Anxiety, loneliness and Fear of Missing Out: The impact of social media on young people... - 0 views

  • By 2021, it is forecast that there will be around 3 billion active monthly users of social media. From the statistics alone, it’s clear that social media has become an integral (and to a large extent, unavoidable) part of our lives.
  • One implication of social media’s rapid rise, that of its relationship with young people’s mental health, has gathered a significant amount of attention in recent years.
  • So-called ‘social media addiction’ has been examined by a wide variety of studies and experiments. Addiction to social media is thought to affect around 5% of young people and was recently described as potentially more addictive than alcohol and cigarettes.
  • The ‘urge’ to check one’s social media may be linked to both instant gratification (the need to experience fast, short term pleasure) and dopamine production (the chemical in the brain associated with reward and pleasure).
  • The popular concept of Fear of Missing Out (FOMO) refers to ‘a pervasive apprehension that others might be having rewarding experiences from which one is absent’ and is ‘characterised by the desire to stay continually connected with what others are doing’.
  • What is dangerous about this compulsive use is that, if gratification is not experienced, users may internalise beliefs that this is due to being ‘unpopular’, ‘unfunny’ etc. A lack of ‘likes’ on a status update may cause negative self-reflection, prompting continual ‘refreshing’ of the page in the hope of seeing that another person has ‘enjoyed’ the post, thus helping to achieve personal validation.
  • Researchers at the University of Glasgow found that young people had difficulty relaxing after night-time social media use, reducing their brain’s ability to prepare for sleep. Sleep loss and mental health reinforce each other in a vicious cycle: loss of sleep due to night-time social media use can lead to poorer mental health, and poor mental health can lead to more intense night-time use and further sleep loss.
  • Data from qualitative studies has shown that using social media compulsively can damage sleeping patterns, having an adverse effect on young people’s performance in school
  • FOMO has been linked to intensive social media use and is associated with lower mood and life satisfaction.
  • Social media has been linked to poor self-esteem and self-image through the advent of image manipulation on photo sharing platforms. In particular, the notion of the ‘idealised body image’ has arguably been detrimental to self-esteem and image, especially that of young women. The 24/7 circulation of easily viewable manipulated images promotes and entrenches unrealistic expectations of how young people should look and behave.
  • The evidence suggests that social media use is strongly associated with anxiety, loneliness and depression
Javier E

Kids and Social Media: a Mental Health Crisis or Moral Panic? - 0 views

  • given the range of evidence, and the fact that the biggest increases relate to a specific group (teenage girls) and a specific set of issues clustered around anxiety and body image, I would assign a high probability to it being a real issue. Especially as it fits the anecdotal conversations I have with headteachers and parents.
  • Is social media the cause?
  • One of the most commonly identified culprits is social media. Until recently I’ve been sceptical for two reasons. First, I’m allergic to moral panics.
  • Secondly, as Stuart Ritchie points out in this excellent article, the evidence assembled to date by proponents of the social media theory, like Jonathan Haidt and Jean Twenge, has shown correlations, not causal relationships. Yes, it seems that young people who use social media a lot have worse mental health, but that could easily be because young people with worse mental health choose to use social media more!
  • recently I’ve shifted to thinking it probably is a major cause for three reasons:
  • 1. I can’t think of anything else that fits. Other suggested causes just don’t work.
  • Social media does fit, the big increase in take up maps well on to the mental health data and it happened everywhere in rich countries at the same time. The most affected group, teenage girls, are also the ones who report that social media makes them more anxious and body conscious in focus groups
  • It is of course true that correlation doesn’t prove anything, but if there’s only one strongly related correlation, it’s pretty likely there’s a relationship.
  • 2. There is no doubt that young people are spending a huge amount of time online now. And that, therefore, must have replaced other activities that involve being out with friends in real life. Three quarters of 12-year-olds now have a social media profile and 95% of teenagers use social media regularly. Over half of those who say they’ve been bullied say it was on social media.
  • 3. We finally have the first evidence of a direct causal relationship, via a very clever US study that used the staged rollout of Facebook across US college campuses to assess its impact on mental health. Not only does it show that mental illness increased after the introduction of Facebook, but also that the increase was particularly pronounced among those more likely to view themselves unfavourably alongside their peers due to, e.g., being overweight or having lower socio-economic status. It is just one study, but it nudges me even further towards thinking this is a major cause of the problem.
  • I have blocked my (12 year old) twins from all social media apps and will hold out as long as possible. The evidence isn’t yet rock solid but it’s solid enough to make me want to protect them as best I can.
pier-paolo

Computers Already Learn From Us. But Can They Teach Themselves? - The New York Times - 0 views

  • We teach computers to see patterns, much as we teach children to read. But the future of A.I. depends on computer systems that learn on their own, without supervision, researchers say.
  • When a mother points to a dog and tells her baby, “Look at the doggy,” the child learns what to call the furry four-legged friends. That is supervised learning. But when that baby stands and stumbles, again and again, until she can walk, that is something else. Computers are the same.
  • Even if a supervised learning system read all the books in the world, he noted, it would still lack human-level intelligence because so much of our knowledge is never written down.
  • Supervised learning depends on annotated data: images, audio or text that is painstakingly labeled by hordes of workers. They circle people or outline bicycles on pictures of street traffic. The labeled data is fed to computer algorithms, teaching the algorithms what to look for. After ingesting millions of labeled images, the algorithms become expert at recognizing what they have been taught to see.
  • There is also reinforcement learning, with very limited supervision, that does not rely on training data. Reinforcement learning in computer science is modeled after reward-driven learning in the brain: think of a rat learning to push a lever to receive a pellet of food. The strategy has been developed to teach computer systems to take actions.
  • “My money is on self-supervised learning,” he said, referring to computer systems that ingest huge amounts of unlabeled data and make sense of it all without supervision or reward. He is working on models that learn by observation, accumulating enough background knowledge that some sort of common sense can emerge. (A minimal code sketch contrasting supervised and self-supervised learning follows this list.)
  • A more inclusive term for the future of A.I., he said, is “predictive learning,” meaning systems that not only recognize patterns but also predict outcomes and choose a course of action. “Everybody agrees we need predictive learning, but we disagree about how to get there,”
  • “A huge fraction of what we do in our day-to-day jobs is constantly refining our mental models of the world and then using those mental models to solve problems,” he said. “That encapsulates an awful lot of what we’d like A.I. to do.”
  • Currently, robots can operate only in well-defined environments with little variation.
  • “Our working assumption is that if we build sufficiently general algorithms, then all we really have to do, once that’s done, is to put them in robots that are out there in the real world doing real things,”
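
A toy sketch can make the supervised/self-supervised contrast in the clips above concrete. This is my own illustration, not code from the article, and every name and number in it is invented: in the supervised half a human supplies the labels, while in the self-supervised half the “labels” are manufactured from unlabeled data itself, by predicting the next value of a sequence from its recent past.

    import numpy as np

    # --- Supervised learning: a human supplies the labels ---
    # Toy inputs X with hand-annotated labels y (say, "dog" = 1, "not dog" = 0).
    X = np.array([[0.9, 0.1], [0.8, 0.3], [0.1, 0.9], [0.2, 0.7]])
    y = np.array([1.0, 1.0, 0.0, 0.0])  # painstakingly labeled by people

    # Fit a linear model by least squares.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

    # --- Self-supervised learning: the labels come from the data itself ---
    seq = np.sin(np.linspace(0, 10, 200))  # unlabeled observations
    k = 3                                  # context window
    contexts = np.array([seq[i:i + k] for i in range(len(seq) - k)])
    targets = seq[k:]                      # the "label" is just the next value

    w_ss, *_ = np.linalg.lstsq(contexts, targets, rcond=None)
    print("supervised weights:", w)
    print("next-value prediction:", contexts[-1] @ w_ss)

Both halves solve the same least-squares problem; the only difference is where the supervision signal comes from, which is the essence of the self-supervised approach described above.
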
Javier E

How Inequality Hollows Out the Soul - NYTimes.com - 0 views

  • Now that we can compare robust data for different countries, we can see not only what we knew intuitively — that inequality is divisive and socially corrosive — but that it also damages the individual psyche.
  • Our tendency to equate outward wealth with inner worth invokes deep psychological responses, feelings of dominance and subordination, superiority and inferiority. This affects the way we see and treat one another.
  • To compare mental illness rates internationally, the World Health Organization asked people in each country about their mood, tiredness, agitation, concentration, sleeping patterns and self-confidence. These have been found to be good indicators of mental illness.
  • in developed countries, major and minor mental illnesses were three times as common in societies where there were bigger income differences between rich and poor. In other words, an American is likely to know three times as many people with depression or anxiety problems as someone in Japan or Germany.
  • One, looking at the 50 American states, discovered that after taking account of age, income and educational differences, depression was more common in states with greater income inequality
  • schizophrenia was about three times as common in more unequal societies as it was in more equal ones.
  • a wide range of mental disorders might originate in a “dominance behavioral system.” This part of our evolved psychological makeup, almost universal in mammals, enables us to recognize and respond to social ranking systems based on hierarchy and power. One brain-imaging study discovered that there were particular areas of the brain and neural mechanisms dedicated to processing social rank.
  • psychiatric conditions like mania and narcissism are related to our striving for status and dominance, while disorders such as anxiety and depression may involve responses to the experience of subordination
  • how does increasing inequality factor in? One of the important effects of wider income differences between rich and poor is to intensify the issues of dominance and subordination, and feelings of superiority and inferiority.
  • A new study by Dublin-based researchers of 34,000 people in 31 countries found that in countries with bigger income differences, status anxiety was more common at all levels in the social hierarchy
  • self-enhancement or self-aggrandizement — the tendency to present an inflated view of oneself — occurred much more frequently in more unequal societies.
  • In the United States, research psychologists have shown that narcissism rates, as measured by a standard academic tool known as the Narcissistic Personality Inventory, rose rapidly from the later 1980s, which would appear to track the increases in inequality
  • as larger differences in material circumstances create greater social distances, feelings of superiority and inferiority increase. In short, growing inequality makes us all more neurotic about “image management” and how we are seen by others.
  • Humans instinctively know how to cooperate and create social ties, but we also know how to engage in status competition — how to be snobs and how to talk ourselves up. We use these alternative social strategies almost every day of our lives, but crucially, inequality shifts the balance between them.
  • we become less nice people in more unequal societies. But we are less nice and less happy: Greater inequality redoubles status anxiety, damaging our mental health and distorting our personalities — wherever we are on the social spectrum.
kushnerha

Aphantasia: A life without mental images - BBC News - 0 views

  • Most people can readily conjure images inside their head - known as their mind's eye. But this year scientists have described a condition, aphantasia, in which some people are unable to visualise mental images.
  • Our memories are often tied up in images; think back to a wedding or first day at school. As a result, Niel admits, some aspects of his memory are "terrible", but he is very good at remembering facts. And, like others with aphantasia, he struggles to recognise faces. Yet he does not see aphantasia as a disability, but simply a different way of experiencing life.
  • “When I think about my fiancee there is no image, but I am definitely thinking about her, I know today she has her hair up at the back, she’s brunette.” “But I’m not describing an image I am looking at, I’m remembering features about her, that’s the strangest thing and maybe that is a source of some regret.” The response from his mates is a very sympathetic: “You’re weird.”
  • One person who took part in a study into aphantasia said he had started to feel "isolated" and "alone" after discovering that other people could see images in their heads. Being unable to reminisce about his mother years after her death led to him being "extremely distraught".
  • Adam Zeman, a professor of cognitive and behavioural neurology, wants to compare the lives and experiences of people with aphantasia and its polar opposite, hyperphantasia. His team, based at the University of Exeter, coined the term aphantasia this year
  • How we imagine is clearly very subjective - one person’s vivid scene could be another’s grainy picture. But Prof Zeman is certain that aphantasia is real. People often report being able to dream in pictures, and there have been reported cases of people losing the ability to think in images after a brain injury. He is adamant that aphantasia is “not a disorder” and says it may affect up to one in 50 people.
Javier E

Secrets of a Mind-Gamer - NYTimes.com - 0 views

  • “What you have to understand is that even average memories are remarkably powerful if used properly,” Cooke said. He explained to me that mnemonic competitors saw themselves as “participants in an amateur research program” whose aim is to rescue a long-lost tradition of memory training.
  • it wasn’t so long ago that culture depended on individual memories. A trained memory was not just a handy tool but also a fundamental facet of any worldly mind. It was considered a form of character-building, a way of developing the cardinal virtue of prudence and, by extension, ethics. Only through memorizing, the thinking went, could ideas be incorporated into your psyche and their values absorbed.
  • all the other mental athletes I met kept insisting that anyone could do what they do. It was simply a matter of learning to “think in more memorable ways,” using a set of mnemonic techniques almost all of which were invented in ancient Greece. These techniques existed not to memorize useless information like decks of playing cards but to etch into the brain foundational texts and ideas.
  • not only did the brains of the mental athletes appear anatomically indistinguishable from those of the control subjects, but on every test of general cognitive ability, the mental athletes’ scores came back well within the normal range.
  • There was, however, one telling difference between the brains of the mental athletes and those of the control subjects. When the researchers looked at the parts of the brain that were engaged when the subjects memorized, they found that the mental athletes were relying more heavily on regions known to be involved in spatial memory.
  • just about anything could be imprinted upon our memories, and kept in good order, simply by constructing a building in the imagination and filling it with imagery of what needed to be recalled. This imagined edifice could then be walked through at any time in the future. Such a building would later come to be called a memory palace. (A toy sketch of the idea in code follows this list.)
  • Memory training was considered a centerpiece of classical education in the language arts, on par with grammar, logic and rhetoric. Students were taught not just what to remember but how to remember it. In a world with few books, memory was sacrosanct.
  • In his essay “First Steps Toward a History of Reading,” Robert Darnton describes a switch from “intensive” to “extensive” reading that occurred as printed books began to proliferate.
  • Until relatively recently, people read “intensively,” Darnton says. “They had only a few books — the Bible, an almanac, a devotional work or two — and they read them over and over again, usually aloud and in groups, so that a narrow range of traditional literature became deeply impressed on their consciousness.” Today we read books “extensively,” often without sustained focus, and with rare exceptions we read each book only once. We value quantity of reading over quality of reading.
  • “Rhetorica ad Herennium” underscores the importance of purposeful attention by making a distinction between natural memory and artificial memory:
  • Our hunter-gatherer ancestors didn’t need to recall phone numbers or word-for-word instructions from their bosses or the Advanced Placement U.S. history curriculum or (because they lived in relatively small, stable groups) the names of dozens of strangers at a cocktail party. What they did need to remember was where to find food and resources and the route home and which plants were edible and which were poisonous
  • What distinguishes a great mnemonist, I learned, is the ability to create lavish images on the fly, to paint in the mind a scene so unlike any other it cannot be forgotten. And to do it quickly.
  • the three stages of acquiring a new skill. During the first phase, known as the cognitive phase, we intellectualize the task and discover new strategies to accomplish it more proficiently. During the second
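
Structurally, the memory palace described above is an ordered key-value store whose keys are stops on a route the memorizer already knows by heart. A toy sketch, purely my own illustration (the loci and items are invented):

    # loci: a fixed, familiar route through an imagined building.
    loci = ["front door", "hallway mirror", "staircase", "kitchen table", "back garden"]
    items = ["queen of spades", "seven of hearts", "ace of clubs", "two of diamonds"]

    # Encoding: bind each item, in order, to the next stop on the route,
    # ideally as a vivid composite image ("a queen lounging on the front door").
    palace = list(zip(loci, items))

    # Recall: walk the route in order; each locus cues its item.
    for place, item in palace:
        print(f"at the {place}: {item}")

The route supplies both the ordering and the retrieval cues, which fits the brain-imaging finding above that mental athletes lean unusually heavily on regions involved in spatial memory.
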
Javier E

Anxiety and Depression Are on an 80-Year Upswing -- Science of Us - 1 views

  • Ever since the 1930s, young people in America have reported feeling increasingly anxious and depressed. And no one knows exactly why. One of the researchers who has done the most work on this subject is Dr. Jean Twenge, a social psychologist at San Diego State University who is the author of Generation Me: Why Today’s Young Americans Are More Confident, Assertive, Entitled—and More Miserable Than Ever Before. She’s published a handful of articles on this trajectory, and the underlying story, she thinks, is a rather negative one. “I think the research tells us that modern life is not good for mental health,” she said.
  • The words “depression” and “anxiety” themselves, after all, mean very different things to someone asked about them in 1935 as compared to 1995, so surveys that invoke these concepts directly have only limited utility for longitudinal study. To get around this, Twenge prefers to rely on surveys and inventories in which respondents are asked about specific symptoms that are frequently correlated with anxiety and depression
  • Much of the richest data on this question, then, comes from the Minnesota Multiphasic Personality Inventory (MMPI), which has been administered to high school and college students since the 1930s — and which includes many questions about symptoms. Specifically, it asks — among many other things — whether respondents feel well-rested when they wake up, whether they have trouble thinking, and whether they have experienced dizzy spells, headaches, shortness of breath, a racing heart, and so on.
  • The trendlines are obvious: Asked the same questions at about the same points in their lives, Americans are, over time, experiencing worse and worse symptoms associated with anxiety and depression.
  • there’s an interesting recent wrinkle to this trajectory. In a paper published in 2014 in Social Indicators Research, Twenge tracked the results of the Monitoring the Future (MtF) survey, “a nationally representative sample of U.S. 12th graders [administered] every year since 1976,” between 1982 and 2013. Like the MMPI, the MtF asks students about symptoms in a manner that should be generally resistant to cultural change: The somatic items Twenge examined asked about trouble sleeping, remembering things, thinking/concentrating, and learning, as well as shortness of breath. An interesting recent pattern emerged on these measures:
  • All the items end up significantly higher than where they started, but for many of them most of the increase happens over the first half of the time period in question. From the late 1990s or so until 2013, many of the items bounce around a bit but ultimately remain flat, or flat-ish.
  • drugs — Prozac and Lexapro, among others — have been prescribed to millions of people who experience these symptoms, many of whom presumably saw some improvement once the drugs kicked in, so this explanation at least makes intuitive sense.
  • there are likely other factors leading to the plateau as well, said Twenge. For one thing, the “crime rate is lower [today] than it was when it peaked in the early 1990s,” and dealing with crime can lead to anxiety and depression symptoms. Other indicators of youth well-being, like teen pregnancy, were also significantly higher back then, and could have accounted for the trajectory visible on the graphs.“For whatever reason,” said Twenge, “if you look at what was going on back then, the early 1990s were not a good time, particularly for young people.”
  • “Obviously there’s a lot of good things about societal and technological progress,” she said, “and in a lot of ways our lives are much easier than, say, our grandparents’ or great-grandparents’ lives. But there’s a paradox here that we seem to have so much ease and relative economic prosperity compared to previous centuries, yet there’s this dissatisfaction, there’s this unhappiness, there are these mental health issues in terms of depression and anxiety.”
  • She thinks the primary problem is that “modern life doesn’t give us as many opportunities to spend time with people and connect with them, at least in person, compared to, say, 80 years ago or 100 years ago. Families are smaller, the divorce rate is higher, people get married much later in life.”
  • it may simply be the case that many people who lived in less equal, more “traditional” times were forced into close companionship with a lot of other people, and that this shielded them from certain psychological problems, whatever else was going on in their lives.
  • “She was virtually never alone — and that can be a bad thing, clearly, but from a mental health perspective being surrounded by people is a good thing.”
  • the shift away from this sort of life has also brought with it a shift in values, and Twenge thinks that this, too, can account for the increase in anxiety and depression. “There’s clear evidence that the focus on money, fame, and image has gone up,
  • “and there’s also clear evidence that people who focus on money, fame, and image are more likely to be depressed and anxious.”
  • “It’s so tempting to say the world is going to hell in a handbasket and everything’s bad, but there are so many good things about modern life,” she said. So maybe the key message here is that while there’s no way to go back to family farms and young marriage and parenthood — and, from an equality standpoint, we wouldn’t want to anyway — modern life needs to do a better job of connecting people to one another, and encouraging them to adopt the sorts of goals and outlooks that will make them happy.
Javier E

The Psychopath Makeover - The Chronicle Review - The Chronicle of Higher Education - 0 views

  • The eminent criminal psychologist and creator of the widely used Psychopathy Checklist paused before answering. "I think, in general, yes, society is becoming more psychopathic," he said. "I mean, there's stuff going on nowadays that we wouldn't have seen 20, even 10 years ago. Kids are becoming anesthetized to normal sexual behavior by early exposure to pornography on the Internet. Rent-a-friend sites are getting more popular on the Web, because folks are either too busy or too techy to make real ones. ... The recent hike in female criminality is particularly revealing. And don't even get me started on Wall Street."
  • in a survey that has so far tested 14,000 volunteers, Sara Konrath and her team at the University of Michigan’s Institute for Social Research have found that college students’ self-reported empathy levels (as measured by the Interpersonal Reactivity Index, a standardized questionnaire containing such items as "I often have tender, concerned feelings for people less fortunate than me" and "I try to look at everybody's side of a disagreement before I make a decision") have been in steady decline over the past three decades—since the inauguration of the scale, in fact, back in 1979. A particularly pronounced slump has been observed over the past 10 years. "College kids today are about 40 percent lower in empathy than their counterparts of 20 or 30 years ago," Konrath reports.
  • Imagining, it would seem, really does make it so. Whenever we read a story, our level of engagement is such that we "mentally simulate each new situation encountered in a narrative," according to one of the researchers, Nicole Speer. Our brains then interweave these newly encountered situations with knowledge and experience gleaned from our own lives to create an organic mosaic of dynamic mental syntheses.
  • during this same period, students' self-reported narcissism levels have shot through the roof. "Many people see the current group of college students, sometimes called 'Generation Me,' " Konrath continues, "as one of the most self-centered, narcissistic, competitive, confident, and individualistic in recent history."
  • Reading a book carves brand-new neural pathways into the ancient cortical bedrock of our brains. It transforms the way we see the world—makes us, as Nicholas Carr puts it in his recent essay, "The Dreams of Readers," "more alert to the inner lives of others." We become vampires without being bitten—in other words, more empathic. Books make us see in a way that casual immersion in the Internet, and the quicksilver virtual world it offers, doesn't.
  • if society really is becoming more psychopathic, it's not all doom and gloom. In the right context, certain psychopathic characteristics can actually be very constructive. A neurosurgeon I spoke with (who rated high on the psychopathic spectrum) described the mind-set he enters before taking on a difficult operation as "an intoxication that sharpens rather than dulls the senses." In fact, in any kind of crisis, the most effective individuals are often those who stay calm—who are able to respond to the exigencies of the moment while at the same time maintaining the requisite degree of detachment.
  • mental toughness isn't the only characteristic that Special Forces soldiers have in common with psychopaths. There's also fearlessness.
  • I ask Andy whether he ever felt any regret over anything he'd done. Over the lives he'd taken on his numerous secret missions around the world. "No," he replies matter-of-factly, his arctic-blue eyes showing not the slightest trace of emotion. "You seriously don't think twice about it. When you're in a hostile situation, the primary objective is to pull the trigger before the other guy pulls the trigger. And when you pull it, you move on. Simple as that. Why stand there, dwelling on what you've done? Go down that route and chances are the last thing that goes through your head will be a bullet from an M16. "The regiment's motto is 'Who Dares Wins.' But sometimes it can be shortened to 'F--- It.' "
  • one of the things that we know about psychopaths is that the light switches of their brains aren't wired up in quite the same way as the rest of ours are—and that one area particularly affected is the amygdala, a peanut-size structure located right at the center of the circuit board. The amygdala is the brain's emotion-control tower. It polices our emotional airspace and is responsible for the way we feel about things. But in psychopaths, a section of this airspace, the part that corresponds to fear, is empty.
  • Turn down the signals to the amygdala, of course, and you're well on the way to giving someone a psychopath makeover. Indeed, Liane Young and her team in Boston have since kicked things up a notch and demonstrated that applying TMS to the right temporoparietal junction—a neural ZIP code within that neighborhood—has significant effects not just on lying ability but also on moral-reasoning ability: in particular, ascribing intentionality to others' actions.
  • at an undisclosed moment sometime within the next 60 seconds, the image you see at the present time will change, and images of a different nature will appear on the screen. These images will be violent. And nauseating. And of a graphic and disturbing nature. "As you view these images, changes in your heart rate, skin conductance, and EEG activity will be monitored and compared with the resting levels that are currently being recorded
  • "OK," says Nick. "Let's get the show on the road." He disappears behind us, leaving Andy and me merrily soaking up the incontinence ad. Results reveal later that, at this point, as we wait for something to happen, our physiological output readings are actually pretty similar. Our pulse rates are significantly higher than our normal resting levels, in anticipation of what's to come. But with the change of scene, an override switch flips somewhere in Andy's brain. And the ice-cold Special Forces soldier suddenly swings into action. As vivid, florid images of dismemberment, mutilation, torture, and execution flash up on the screen in front of us (so vivid, in fact, that Andy later confesses to actually being able to "smell" the blood: a "kind of sickly-sweet smell that you never, ever forget"), accompanied not by the ambient spa music of before but by blaring sirens and hissing white noise, his physiological readings start slipping into reverse. His pulse rate begins to slow. His GSR begins to drop, his EEG to quickly and dramatically attenuate. In fact, by the time the show is over, all three of Andy's physiological output measures are pooling below his baseline.
  • Nick has seen nothing like it. "It's almost as if he was gearing himself up for the challenge," he says. "And then, when the challenge eventually presented itself, his brain suddenly responded by injecting liquid nitrogen into his veins. Suddenly implemented a blanket neural cull of all surplus feral emotion. Suddenly locked down into a hypnotically deep code red of extreme and ruthless focus." He shakes his head, nonplused. "If I hadn't recorded those readings myself, I'm not sure I would have believed them," he continues. "OK, I've never tested Special Forces before. And maybe you'd expect a slight attenuation in response. But this guy was in total and utter control of the situation. So tuned in, it looked like he'd completely tuned out."
  • My physiological output readings, in contrast, went through the roof. Exactly like Andy's, they were well above baseline as I'd waited for the carnage to commence. But that's where the similarity ended. Rather than go down in the heat of battle, in the midst of the blood and guts, mine had appreciated exponentially. "At least it shows that the equipment is working properly," comments Nick. "And that you're a normal human being."
  • TMS can't penetrate far enough into the brain to reach the emotion and moral-reasoning precincts directly. But by damping down or turning up the regions of the cerebral cortex that have links with such areas, it can simulate the effects of deeper, more incursive influence.
  • Before the experiment, I'd been curious about the time scale: how long it would take me to begin to feel the rush. Now I had the answer: about 10 to 15 minutes. The same amount of time, I guess, that it would take most people to get a buzz out of a beer or a glass of wine.
  • The effects aren't entirely dissimilar. An easy, airy confidence. A transcendental loosening of inhibition. The inchoate stirrings of a subjective moral swagger: the encroaching, and somehow strangely spiritual, realization that hell, who gives a s---, anyway? There is, however, one notable exception. One glaring, unmistakable difference between this and the effects of alcohol. That's the lack of attendant sluggishness. The enhancement of attentional acuity and sharpness. An insuperable feeling of heightened, polished awareness. Sure, my conscience certainly feels like it's on ice, and my anxieties drowned with a half-dozen shots of transcranial magnetic Jack Daniel's. But, at the same time, my whole way of being feels as if it's been sumptuously spring-cleaned with light. My soul, or whatever you want to call it, immersed in a spiritual dishwasher.
  • So this, I think to myself, is how it feels to be a psychopath. To cruise through life knowing that no matter what you say or do, guilt, remorse, shame, pity, fear—all those familiar, everyday warning signals that might normally light up on your psychological dashboard—no longer trouble you.
  • I suddenly get a flash of insight. We talk about gender. We talk about class. We talk about color. And intelligence. And creed. But the most fundamental difference between one individual and another must surely be that of the presence, or absence, of conscience. Conscience is what hurts when everything else feels good. But what if it's as tough as old boots? What if one's conscience has an infinite, unlimited pain threshold and doesn't bat an eye when others are screaming in agony?
Javier E

Older Really Can Mean Wiser - NYTimes.com - 0 views

  • mental faculties that improve with age.
  • Knowledge is a large part of the equation, of course. People who are middle-aged and older tend to know more than young adults, by virtue of having been around longer, and score higher on vocabulary tests, crossword puzzles and other measures of so-called crystallized intelligence.
  • the older brain offers something more, according to a new paper in the journal Psychological Science. Elements of social judgment and short-term memory, important pieces of the cognitive puzzle, may peak later in life than previously thought.
  • The researchers found that the broad split in age-related cognition — fluid in the young, crystallized in the old — masked several important nuances.
  • A year ago, German scientists argued that cognitive “deficits” in aging were caused largely by the accumulation of knowledge — that is, the brain slows down because it has to search a larger mental library of facts
  • Experts said the new analysis raised a different question: Are there distinct, independent elements of memory and cognition that peak at varying times of life?
  • The strength of the new analysis is partly in its data. The study evaluated historic scores from the popular Wechsler intelligence test, and compared them with more recent results from tens of thousands of people who took short cognitive tests on the authors’ websites, testmybrain.org and gameswithwords.org
  • The one drawback of this approach is that, because it didn’t follow the same people over a lifetime, it might have missed the effect of different cultural experiences
  • most previous studies have not been nearly as large, or had such a range of ages. Participants on the websites were 10 to 89 years old, and they took a large battery of tests, measuring skills like memory for abstract symbols and strings of digits, problem solving, and facility reading emotions from strangers’ eyes.
  • “We found different abilities really maturing or ripening at different ages,” Dr. Germine said. “It’s a much richer picture of the life span than just calling it aging.”
  • At least as important, the researchers looked at the effect of age on each type of test.
  • Processing speed — the quickness with which someone can manipulate digits, words or images, as if on a mental sketch board — generally peaks in the late teens
  • memory for some things, like names, does so in the early 20s
  • But the capacity of that sketch board, called working memory, peaks at least a decade later and is slow to decline. In particular, the ability to recall faces and do some mental manipulation of numbers peaked about age 30,
  • The researchers also analyzed results from the Reading the Mind in the Eyes test. The test involves looking at snapshots of strangers’ eyes on a computer screen and determining their moods from a menu of options like “tentative,” “uncertain” and “skeptical.”
  • people in their 40s or 50s consistently did the best, the study found, and the skill declined very slowly later in life
  • The picture that emerges from these findings is of an older brain that moves more slowly than its younger self, but is just as accurate in many areas and more adept at reading others’ moods — on top of being more knowledgeable. That’s a handy combination, given that so many important decisions people make intimately affects others.
  • for now, the new research at least gives some meaning to the empty adjective “wily.”
caelengrubb

Why Is Memory So Good and So Bad? - Scientific American - 0 views

  • Memories of visual images (e.g., dinner plates) are stored in what is called visual memory.
  • Our minds use visual memory to perform even the simplest of computations, from remembering the face of someone we’ve just met to remembering what time it was when we last checked. Without visual memory, we wouldn’t be able to store—and later retrieve—anything we see.
  • Just as a computer’s memory capacity constrains its abilities, visual memory capacity has been correlated with a number of higher cognitive abilities, including academic success, fluid intelligence (the ability to solve novel problems), and general comprehension.
  • For many reasons, then, it would be very useful to understand how visual memory facilitates these mental operations, as well as constrains our ability to perform them
  • Visual working memory is where visual images are temporarily stored while your mind works away at other tasks—like a whiteboard on which things are briefly written and then wiped away. We rely on visual working memory when remembering things over brief intervals, such as when copying lecture notes to a notebook.
  • UC Davis psychologists Weiwei Zhang and Steven Luck have shed some light on this problem. In their experiment, participants briefly saw three colored squares flashed on a computer screen, and were asked to remember the colors of each square. Then, after 1, 4 or 10 seconds the squares re-appeared, except this time their colors were missing, so that all that was visible were black squares outlined in white.
  • The participants had a simple task: to recall the color of one particular square, not knowing in advance which square they would be asked to recall. The psychologists assumed that measuring how visual working memory behaves under increasing demands (i.e., the increasing durations of 1, 4 or 10 seconds) would reveal something about how the system works.
  • If short-term visual memories fade away gradually—if they are slowly wiped from the whiteboard—then after longer intervals participants’ responses should remain centered on the square’s original color but deviate from it more and more. But if these memories are wiped out all at once—if the whiteboard is left untouched until, in a single moment, it is scrubbed clean—then participants should make either very precise responses (corresponding to instances when the memories are still untouched) or, after the interval grows too long, completely random guesses. (A small simulation of these two predictions follows this list.)
  • Which is exactly what happened: Zhang & Luck found that participants were either very precise, or they completely guessed; that is, they either remembered the square’s color with great accuracy, or forgot it completely
  • But this, it turns out, is not true of all memories
  • In a recent paper, researchers at MIT and Harvard found that, if a memory can survive long enough to make it into what is called “visual long-term memory,” then it doesn’t have to be wiped out at all.
  • Talia Konkle and colleagues showed participants a stream of three thousand images of different scenes, such as ocean waves, golf courses or amusement parks. Then, participants were shown two hundred pairs of images—an old one they had seen in the first task, and a completely new one—and asked to indicate which was the old one.
  • Participants were remarkably accurate at spotting differences between the new and old images—96 percent
  • In a recent review, researchers at Harvard and MIT argue that the critical factor is how meaningful the remembered images are—whether the content of the images you see connects to pre-existing knowledge about them
  • This prior knowledge changes how these images are processed, allowing thousands of them to be transferred from the whiteboard of short-term memory into the bank vault of long-term memory, where they are stored with remarkable detail.
  • Together, these experiments suggest why memories are not eliminated equally— indeed, some don’t seem to be eliminated at all. This might also explain why we’re so hopeless at remembering some things, and yet so awesome at remembering others.
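
The two hypotheses in the Zhang & Luck bullets are easy to simulate. The sketch below is my own illustration, not the authors’ code, and every parameter (noise widths, survival half-life) is invented; it just shows that gradual fading predicts responses that drift ever further from the true color, while all-or-none forgetting predicts a mixture of precise responses and uniform random guesses.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    true_color = 0.0  # position on a 360-degree color wheel

    def gradual_fade(delay_s):
        # Hypothesis 1: the trace decays smoothly, so noise grows with delay.
        return rng.normal(true_color, 10.0 + 8.0 * delay_s, n)

    def all_or_none(delay_s, half_life=4.0):
        # Hypothesis 2: the trace survives intact with some probability;
        # otherwise the response is a pure guess, uniform on the wheel.
        survives = rng.random(n) < 0.5 ** (delay_s / half_life)
        precise = rng.normal(true_color, 10.0, n)
        guesses = rng.uniform(-180.0, 180.0, n)
        return np.where(survives, precise, guesses)

    for delay in (1, 4, 10):  # the delays used in the experiment, in seconds
        for name, model in (("fade", gradual_fade), ("all-or-none", all_or_none)):
            err = np.abs(model(delay))
            print(f"delay {delay:2d}s  {name:11s}  within 30 degrees: {np.mean(err < 30):.2f}")

Under the all-or-none model responses either stay tightly clustered or become flat guesses, the bimodal pattern Zhang & Luck actually observed.
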
sandrine_h

Can You Get Smarter? - The New York Times - 0 views

  • Can you get smarter by exercising — or altering — your brain?
  • Americans are a captive market for anything, from supposed smart drugs and supplements to brain training, that promises to boost normal mental functioning or to stem its all-too-common decline.
  • Our brain has remarkable neuroplasticity; that is, it can remodel and change itself in response to various experiences and injuries. So can it be trained to enhance its own cognitive prowess? The multibillion-dollar brain training industry certainly thinks so and claims that you can increase your memory, attention and reasoning just by playing various mental games.
  • Although improvements were observed in every cognitive task that was practiced, there was no evidence that brain training made people smarter.
  • we can clearly enhance learning, even if mental gymnastics won’t make us smarter.
  • Adderall enhanced performance on one of the tests, the embedded image test, which requires subjects to reassemble a whole image from a scrambled one. Still, these are subtle effects, and there is no evidence that any prescription drug or supplement or smart drink is going to raise your I.Q.
  • In the end, you can’t yet exceed your innate intelligence
Javier E

A New Kind of Tutoring Aims to Make Students Smarter - NYTimes.com - 1 views

  • the goal is to improve cognitive skills. LearningRx is one of a growing number of such commercial services — some online, others offered by psychologists. Unlike traditional tutoring services that seek to help students master a subject, brain training purports to enhance comprehension and the ability to analyze and mentally manipulate concepts, images, sounds and instructions. In a word, it seeks to make students smarter.
  • “The average gain on I.Q. is 15 points after 24 weeks of training, and 20 points in less than 32 weeks.”
  • “Our users have reported profound benefits that include: clearer and quicker thinking; faster problem-solving skills; increased alertness and awareness; better concentration at work or while driving; sharper memory for names, numbers and directions.”
  • “It used to take me an hour to memorize 20 words. Now I can learn, like, 40 new words in 20 minutes.”
  • “I don’t know if it makes you smarter. But when you get to each new level on the math and reading tasks, it definitely builds up your self-confidence.”
  • “What you care about is not an intelligence test score, but whether your ability to do an important task has really improved. That’s a chain of evidence that would be really great to have. I haven’t seen it.”
  • Still, a new and growing body of scientific evidence indicates that cognitive training can be effective, including that offered by commercial services.
  • He looked at 340 middle-school students who spent two hours a week for a semester using LearningRx exercises in their schools’ computer labs and an equal number of students who received no such training. Those who played the online games, Dr. Hill found, not only improved significantly on measures of cognitive abilities compared to their peers, but also on Virginia’s annual Standards of Learning exam.
  • I’ve had some kids who not only reported that they had very big changes in the classroom, but when we bring them back in the laboratory to do neuropsychological testing, we also see great changes. They show increases that would be highly unlikely to happen just by chance.”
  • where crosswords and Sudoku are intended to be a diversion, the games here give that same kind of reward, only they’re designed to improve your brain, your memory, your problem-solving skills.”
  • More than 40 games are offered by Lumosity. One, the N-back, is based on a task developed decades ago by psychologists. Created to test working memory, the N-back challenges users to keep track of a continuously updated list and remember which item appeared “n” times ago.
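
The N-back logic in the last bullet fits in a few lines. This is my own sketch of the task, not Lumosity’s implementation; the alphabet and trial count are arbitrary choices.

    import random
    from collections import deque

    def play_n_back(n=2, trials=15, alphabet="ABCD"):
        """Console N-back: answer 'y' if the current letter matches
        the one shown n trials ago; anything else counts as 'no'."""
        history = deque(maxlen=n)  # only the last n items must be held in mind
        score = 0
        for t in range(trials):
            letter = random.choice(alphabet)
            is_match = len(history) == n and history[0] == letter
            answer = input(f"trial {t + 1}: {letter}  match? ").strip().lower() == "y"
            score += answer == is_match
            history.append(letter)
        print(f"{score}/{trials} correct")

    play_n_back()

The fixed-length deque is exactly the “continuously updated list” the bullet describes: each new letter pushes the oldest one out, so the player has to keep the whole window in working memory.
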
sissij

Training the brain to boost self-confidence - Medical News Today - 0 views

  • Self-confidence is generally defined as the belief in one's own abilities. As the University of Queensland in Australia put it, self-confidence describes "an internal state made up of what we think and feel about ourselves."
  • low self-confidence can also increase the risk of mental health problems, such as depression and bipolar disorder.
  • The researchers came to their findings through the use of a novel imaging technique known as "decoded neurofeedback." This involves brain scans to monitor complex brain activity patterns.
  •  
    Scientists use patterns in the experiment to form hypotheses. I find it interesting that although correlation does not mean causation, it is still very useful for inductive reasoning. This article also talks about how confidence can affect us, and how we can affect our confidence. The definition of confidence here states that confidence is our belief in ourselves. Why do we need confidence? Why do we need an internal statement to reassure us that our decision is right? --Sissi (12/22/2016)
Javier E

A 'Philosophy' of Plastic Surgery in Brazil - NYTimes.com - 0 views

  • Is beauty a right, which, like education or health care, should be realized with the help of public institutions and expertise?
  • For years he has performed charity surgeries for the poor. More radically, some of his students offer free cosmetic operations in the nation’s public health system.
  • I asked her why she wanted to have the surgery.  “I didn’t put in an implant to exhibit myself, but to feel better. It wasn’t a simple vanity, but a  . . . necessary vanity.  Surgery improves a woman’s auto-estima.”
  • He argues that the real object of healing is not the body, but the mind.  A plastic surgeon is a “psychologist with a scalpel in his hand.” This idea led Pitanguy to argue for the “union” of cosmetic and reconstructive procedures.  In both types of surgery beauty and mental healing subtly mingle, he claims, and both benefit health.
  • “What is the difference between a plastic surgeon and a psychoanalyst?  The psychoanalyst knows everything but changes nothing.  The plastic surgeon knows nothing but changes everything.”
  • Plastic surgery gained legitimacy in the early 20th century by limiting itself to reconstructive operations.  The “beauty doctor” was a term of derision.  But as techniques improved they were used for cosmetic improvements.  Missing, however, was a valid diagnosis. Concepts like psychoanalyst Alfred Adler’s inferiority complex — and later low self-esteem — provided a missing link.
  • Victorians saw a cleft palate as a defect that built character. For us it hinders self-realization and merits corrective surgery.  This shift reflects a new attitude towards appearance and mental health: the notion that at least some defects cause unfair suffering and social stigma is now widely accepted. But Brazilian surgeons take this reasoning a step further.  Cosmetic surgery is a consumer service in most of the world.  In Brazil it is becoming, as Ester put it, a “necessary vanity.”
  • Pitanguy, whose patients often have mixed African, indigenous and European ancestry, stresses that aesthetic ideals vary by epoch and ethnicity.  What matters are not objective notions of beauty, but how the patient feels.  As his colleague says, the job of the plastic surgeon is to simply “follow desires.”
  • Patients are on average younger than they were 20 years ago.  They often request minor changes to become, as one surgeon said, “more perfect.”
  • The growth of plastic surgery thus reflects a new way of working not only on the suffering mind, but also on the erotic body.  Unlike fashion’s embrace of playful dissimulation and seduction, this beauty practice instead insists on correcting precisely measured flaws.  Plastic surgery may contribute to a biologized view of sex where pleasure and fantasy matter less than the anatomical “truth” of the bare body.
  • It is not coincidental that Brazil has not only high rates of plastic surgery, but also Cesarean sections (70 percent of deliveries in some private hospitals), tubal ligations,  and other surgeries for women. Some women see elective surgeries as part of a modern standard of care, more or less routine for the middle class, but only sporadically available to the poor.
  • When a good life is defined through the ability to buy goods, then rights may be reinterpreted to mean not equality before the law, but equality in the market.
  • Beauty is unfair: the attractive enjoy privileges and powers gained without merit.  As such it can offend egalitarian values.  Yet while attractiveness is a quality “awarded” to those who don’t morally deserve it, it can also grant power to those excluded from other systems of privilege.  It is a kind of “double negative”: a form of power that is unfairly distributed but which can disturb other unfair hierarchies.  For this reason it may have democratic appeal.  In poor urban areas beauty often has a similar importance for girls as soccer (or basketball) does for boys: it promises an almost magical attainment of recognition, wealth or power.
  • For many consumers attractiveness is essential to economic and sexual competition, social visibility, and mental well being.  This “value” of appearance may be especially clear for those excluded from other means of social ascent.  For the poor beauty is often a form of capital that can be exchanged for other benefits, however small, transient, or unconducive to collective change.
Javier E

ROUGH TYPE | Nicholas Carr's blog - 0 views

  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Social skills and relationships seem to suffer as well.
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience.To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
Javier E

Thieves of experience: On the rise of surveillance capitalism - 1 views

  • Harvard Business School professor emerita Shoshana Zuboff argues in her new book that the Valley’s wealth and power are predicated on an insidious, essentially pathological form of private enterprise—what she calls “surveillance capitalism.” Pioneered by Google, perfected by Facebook, and now spreading throughout the economy, surveillance capitalism uses human life as its raw material. Our everyday experiences, distilled into data, have become a privately-owned business asset used to predict and mold our behavior, whether we’re shopping or socializing, working or voting.
  • By reengineering the economy and society to their own benefit, Google and Facebook are perverting capitalism in a way that undermines personal freedom and corrodes democracy.
  • Under the Fordist model of mass production and consumption that prevailed for much of the twentieth century, industrial capitalism achieved a relatively benign balance among the contending interests of business owners, workers, and consumers. Enlightened executives understood that good pay and decent working conditions would ensure a prosperous middle class eager to buy the goods and services their companies produced. It was the product itself — made by workers, sold by companies, bought by consumers — that tied the interests of capitalism’s participants together. Economic and social equilibrium was negotiated through the product.
  • ...72 more annotations...
  • By removing the tangible product from the center of commerce, surveillance capitalism upsets the equilibrium. Whenever we use free apps and online services, it’s often said, we become the products, our attention harvested and sold to advertisers
  • this truism gets it wrong. Surveillance capitalism’s real products, vaporous but immensely valuable, are predictions about our future behavior — what we’ll look at, where we’ll go, what we’ll buy, what opinions we’ll hold — that internet companies derive from our personal data and sell to businesses, political operatives, and other bidders.
  • Unlike financial derivatives, which they in some ways resemble, these new data derivatives draw their value, parasite-like, from human experience. To the Googles and Facebooks of the world, we are neither the customer nor the product. We are the source of what Silicon Valley technologists call “data exhaust” — the informational byproducts of online activity that become the inputs to prediction algorithms
  • Another 2015 study, appearing in the Journal of Computer-Mediated Communication, showed that when people hear their phone ring but are unable to answer it, their blood pressure spikes, their pulse quickens, and their problem-solving skills decline.
  • The smartphone has become a repository of the self, recording and dispensing the words, sounds and images that define what we think, what we experience and who we are. In a 2015 Gallup survey, more than half of iPhone owners said that they couldn’t imagine life without the device.
  • So what happens to our minds when we allow a single tool such dominion over our perception and cognition?
  • Not only do our phones shape our thoughts in deep and complicated ways, but the effects persist even when we aren’t using the devices. As the brain grows dependent on the technology, the research suggests, the intellect weakens.
  • he has seen mounting evidence that using a smartphone, or even hearing one ring or vibrate, produces a welter of distractions that makes it harder to concentrate on a difficult problem or job. The division of attention impedes reasoning and performance.
  • internet companies operate in what Zuboff terms “extreme structural independence from people.” When databases displace goods as the engine of the economy, our own interests, as consumers but also as citizens, cease to be part of the negotiation. We are no longer one of the forces guiding the market’s invisible hand. We are the objects of surveillance and control.
  • Social skills and relationships seem to suffer as well.
  • In both tests, the subjects whose phones were in view posted the worst scores, while those who left their phones in a different room did the best. The students who kept their phones in their pockets or bags came out in the middle. As the phone’s proximity increased, brainpower decreased.
  • In subsequent interviews, nearly all the participants said that their phones hadn’t been a distraction—that they hadn’t even thought about the devices during the experiment. They remained oblivious even as the phones disrupted their focus and thinking.
  • The researchers recruited 520 undergraduates at UCSD and gave them two standard tests of intellectual acuity. One test gauged “available working-memory capacity,” a measure of how fully a person’s mind can focus on a particular task. The second assessed “fluid intelligence,” a person’s ability to interpret and solve an unfamiliar problem. The only variable in the experiment was the location of the subjects’ smartphones. Some of the students were asked to place their phones in front of them on their desks; others were told to stow their phones in their pockets or handbags; still others were required to leave their phones in a different room.
  • the “integration of smartphones into daily life” appears to cause a “brain drain” that can diminish such vital mental skills as “learning, logical reasoning, abstract thought, problem solving, and creativity.”
  •  Smartphones have become so entangled with our existence that, even when we’re not peering or pawing at them, they tug at our attention, diverting precious cognitive resources. Just suppressing the desire to check our phone, which we do routinely and subconsciously throughout the day, can debilitate our thinking.
  • They found that students who didn’t bring their phones to the classroom scored a full letter-grade higher on a test of the material presented than those who brought their phones. It didn’t matter whether the students who had their phones used them or not: All of them scored equally poorly.
  • A study of nearly a hundred secondary schools in the U.K., published last year in the journal Labour Economics, found that when schools ban smartphones, students’ examination scores go up substantially, with the weakest students benefiting the most.
  • Data, the novelist and critic Cynthia Ozick once wrote, is “memory without history.” Her observation points to the problem with allowing smartphones to commandeer our brains
  • Because smartphones serve as constant reminders of all the friends we could be chatting with electronically, they pull at our minds when we’re talking with people in person, leaving our conversations shallower and less satisfying.
  • In a 2013 study conducted at the University of Essex in England, 142 participants were divided into pairs and asked to converse in private for ten minutes. Half talked with a phone in the room, half without a phone present. The subjects were then given tests of affinity, trust and empathy. “The mere presence of mobile phones,” the researchers reported in the Journal of Social and Personal Relationships, “inhibited the development of interpersonal closeness and trust” and diminished “the extent to which individuals felt empathy and understanding from their partners.”
  • The evidence that our phones can get inside our heads so forcefully is unsettling. It suggests that our thoughts and feelings, far from being sequestered in our skulls, can be skewed by external forces we’re not even aware of.
  •  Scientists have long known that the brain is a monitoring system as well as a thinking system. Its attention is drawn toward any object that is new, intriguing or otherwise striking — that has, in the psychological jargon, “salience.”
  • even in the history of captivating media, the smartphone stands out. It is an attention magnet unlike any our minds have had to grapple with before. Because the phone is packed with so many forms of information and so many useful and entertaining functions, it acts as what Dr. Ward calls a “supernormal stimulus,” one that can “hijack” attention whenever it is part of our surroundings — and it is always part of our surroundings.
  • Imagine combining a mailbox, a newspaper, a TV, a radio, a photo album, a public library and a boisterous party attended by everyone you know, and then compressing them all into a single, small, radiant object. That is what a smartphone represents to us. No wonder we can’t take our minds off it.
  • The irony of the smartphone is that the qualities that make it so appealing to us — its constant connection to the net, its multiplicity of apps, its responsiveness, its portability — are the very ones that give it such sway over our minds.
  • Phone makers like Apple and Samsung and app writers like Facebook, Google and Snap design their products to consume as much of our attention as possible during every one of our waking hours
  • Social media apps were designed to exploit “a vulnerability in human psychology,” former Facebook president Sean Parker said in a recent interview. “[We] understood this consciously. And we did it anyway.”
  • A quarter-century ago, when we first started going online, we took it on faith that the web would make us smarter: More information would breed sharper thinking. We now know it’s not that simple.
  • As strange as it might seem, people’s knowledge and understanding may actually dwindle as gadgets grant them easier access to online data stores
  • In a seminal 2011 study published in Science, a team of researchers — led by the Columbia University psychologist Betsy Sparrow and including the late Harvard memory expert Daniel Wegner — had a group of volunteers read forty brief, factual statements (such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003”) and then type the statements into a computer. Half the people were told that the machine would save what they typed; half were told that the statements would be erased.
  • Afterward, the researchers asked the subjects to write down as many of the statements as they could remember. Those who believed that the facts had been recorded in the computer demonstrated much weaker recall than those who assumed the facts wouldn’t be stored. Anticipating that information would be readily available in digital form seemed to reduce the mental effort that people made to remember it
  • The researchers dubbed this phenomenon the “Google effect” and noted its broad implications: “Because search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
  • as the pioneering psychologist and philosopher William James said in an 1892 lecture, “the art of remembering is the art of thinking.”
  • Only by encoding information in our biological memory can we weave the rich intellectual associations that form the essence of personal knowledge and give rise to critical and conceptual thinking. No matter how much information swirls around us, the less well-stocked our memory, the less we have to think with.
  • As Dr. Wegner and Dr. Ward explained in a 2013 Scientific American article, when people call up information through their devices, they often end up suffering from delusions of intelligence. They feel as though “their own mental capacities” had generated the information, not their devices. “The advent of the ‘information age’ seems to have created a generation of people who feel they know more than ever before,” the scholars concluded, even though “they may know ever less about the world around them.”
  • That insight sheds light on society’s current gullibility crisis, in which people are all too quick to credit lies and half-truths spread through social media. If your phone has sapped your powers of discernment, you’ll believe anything it tells you.
  • A second experiment conducted by the researchers produced similar results, while also revealing that the more heavily students relied on their phones in their everyday lives, the greater the cognitive penalty they suffered.
  • When we constrict our capacity for reasoning and recall or transfer those skills to a gadget, we sacrifice our ability to turn information into knowledge. We get the data but lose the meaning
  • We need to give our minds more room to think. And that means putting some distance between ourselves and our phones.
  • Google’s once-patient investors grew restive, demanding that the founders figure out a way to make money, preferably lots of it.
  • Under pressure, Page and Brin authorized the launch of an auction system for selling advertisements tied to search queries. The system was designed so that the company would get paid by an advertiser only when a user clicked on an ad. This feature gave Google a huge financial incentive to make accurate predictions about how users would respond to ads and other online content. Even tiny increases in click rates would bring big gains in income. And so the company began deploying its stores of behavioral data not for the benefit of users but to aid advertisers — and to juice its own profits. Surveillance capitalism had arrived. (A toy numerical sketch of this auction arithmetic appears after this list.)
  • Google’s business now hinged on what Zuboff calls “the extraction imperative.” To improve its predictions, it had to mine as much information as possible from web users. It aggressively expanded its online services to widen the scope of its surveillance.
  • Through Gmail, it secured access to the contents of people’s emails and address books. Through Google Maps, it gained a bead on people’s whereabouts and movements. Through Google Calendar, it learned what people were doing at different moments during the day and whom they were doing it with. Through Google News, it got a readout of people’s interests and political leanings. Through Google Shopping, it opened a window onto people’s wish lists,
  • The company gave all these services away for free to ensure they’d be used by as many people as possible. It knew the money lay in the data.
  • the organization grew insular and secretive. Seeking to keep the true nature of its work from the public, it adopted what its CEO at the time, Eric Schmidt, called a “hiding strategy” — a kind of corporate omerta backed up by stringent nondisclosure agreements.
  • Page and Brin further shielded themselves from outside oversight by establishing a stock structure that guaranteed their power could never be challenged, neither by investors nor by directors.
  • What’s most remarkable about the birth of surveillance capitalism is the speed and audacity with which Google overturned social conventions and norms about data and privacy. Without permission, without compensation, and with little in the way of resistance, the company seized and declared ownership over everyone’s information
  • The companies that followed Google presumed that they too had an unfettered right to collect, parse, and sell personal data in pretty much any way they pleased. In the smart homes being built today, it’s understood that any and all data will be beamed up to corporate clouds.
  • Google conducted its great data heist under the cover of novelty. The web was an exciting frontier — something new in the world — and few people understood or cared about what they were revealing as they searched and surfed. In those innocent days, data was there for the taking, and Google took it
  • Google also benefited from decisions made by lawmakers, regulators, and judges — decisions that granted internet companies free use of a vast taxpayer-funded communication infrastructure, relieved them of legal and ethical responsibility for the information and messages they distributed, and gave them carte blanche to collect and exploit user data.
  • Consider the terms-of-service agreements that govern the division of rights and the delegation of ownership online. Non-negotiable, subject to emendation and extension at the company’s whim, and requiring only a casual click to bind the user, TOS agreements are parodies of contracts, yet they have been granted legal legitimacy by the courts.
  • Law professors, writes Zuboff, “call these ‘contracts of adhesion’ because they impose take-it-or-leave-it conditions on users that stick to them whether they like it or not.” Fundamentally undemocratic, the ubiquitous agreements helped Google and other firms commandeer personal data as if by fiat.
  • In the choices we make as consumers and private citizens, we have always traded some of our autonomy to gain other rewards. Many people, it seems clear, experience surveillance capitalism less as a prison, where their agency is restricted in a noxious way, than as an all-inclusive resort, where their agency is restricted in a pleasing way
  • Zuboff makes a convincing case that this is a short-sighted and dangerous view — that the bargain we’ve struck with the internet giants is a Faustian one
  • but her case would have been stronger still had she more fully addressed the benefits side of the ledger.
  • there’s a piece missing. While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on
  • What the industries of the future will seek to manufacture is the self.
  • Behavior modification is the thread that ties today’s search engines, social networks, and smartphone trackers to tomorrow’s facial-recognition systems, emotion-detection sensors, and artificial-intelligence bots.
  • All of Facebook’s information wrangling and algorithmic fine-tuning, she writes, “is aimed at solving one problem: how and when to intervene in the state of play that is your daily life in order to modify your behavior and thus sharply increase the predictability of your actions now, soon, and later.”
  • “The goal of everything we do is to change people’s actual behavior at scale,” a top Silicon Valley data scientist told her in an interview. “We can test how actionable our cues are for them and how profitable certain behaviors are for us.”
  • This goal, she suggests, is not limited to Facebook. It is coming to guide much of the economy, as financial and social power shifts to the surveillance capitalists
  • Combining rich information on individuals’ behavioral triggers with the ability to deliver precisely tailored and timed messages turns out to be a recipe for behavior modification on an unprecedented scale.
  • it was Facebook, with its incredibly detailed data on people’s social lives, that grasped digital media’s full potential for behavior modification. By using what it called its “social graph” to map the intentions, desires, and interactions of literally billions of individuals, it saw that it could turn its network into a worldwide Skinner box, employing psychological triggers and rewards to program not only what people see but how they react.
  • spying on the populace is not the end game. The real prize lies in figuring out ways to use the data to shape how people think and act. “The best way to predict the future is to invent it,” the computer scientist Alan Kay once observed. And the best way to predict behavior is to script it.
  • competition for personal data intensified. It was no longer enough to monitor people online; making better predictions required that surveillance be extended into homes, stores, schools, workplaces, and the public squares of cities and towns. Much of the recent innovation in the tech industry has entailed the creation of products and services designed to vacuum up data from every corner of our lives
  • “The typical complaint is that privacy is eroded, but that is misleading,” Zuboff writes. “In the larger societal pattern, privacy is not eroded but redistributed . . . . Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism.” The transfer of decision rights is also a transfer of autonomy and agency, from the citizen to the corporation.
  • What we lose under this regime is something more fundamental than privacy. It’s the right to make our own decisions about privacy — to draw our own lines between those aspects of our lives we are comfortable sharing and those we are not
  • Other possible ways of organizing online markets, such as through paid subscriptions for apps and services, never even got a chance to be tested.
  • Online surveillance came to be viewed as normal and even necessary by politicians, government bureaucrats, and the general public
  • Google and other Silicon Valley companies benefited directly from the government’s new stress on digital surveillance. They earned millions through contracts to share their data collection and analysis techniques with the National Security Agency.
  • As much as the dot-com crash, the horrors of 9/11 set the stage for the rise of surveillance capitalism. Zuboff notes that, in 2000, members of the Federal Trade Commission, frustrated by internet companies’ lack of progress in adopting privacy protections, began formulating legislation to secure people’s control over their online information and severely restrict the companies’ ability to collect and store it. It seemed obvious to the regulators that ownership of personal data should by default lie in the hands of private citizens, not corporations.
  • The 9/11 attacks changed the calculus. The centralized collection and analysis of online data, on a vast scale, came to be seen as essential to national security. “The privacy provisions debated just months earlier vanished from the conversation more or less overnight,”
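The pay-per-click arithmetic highlighted in the auction bullet above rewards even tiny gains in click prediction. The toy model below makes that concrete; the bids, click rates, and impression volume are invented for illustration, and real ad auctions (Google's included) use more elaborate ranking and pricing rules.

    # Toy pay-per-click auction: rank ads by expected revenue per
    # impression (bid x predicted click-through rate). All numbers invented.
    ads = [
        {"advertiser": "A", "bid": 2.00, "ctr": 0.010},  # bid in dollars per click
        {"advertiser": "B", "bid": 0.50, "ctr": 0.050},
        {"advertiser": "C", "bid": 1.20, "ctr": 0.020},
    ]

    def expected_revenue(ad: dict) -> float:
        # What the platform expects to earn each time this ad is shown.
        return ad["bid"] * ad["ctr"]

    for ad in sorted(ads, key=expected_revenue, reverse=True):
        print(f"{ad['advertiser']}: ${expected_revenue(ad) * 1000:.2f} per 1,000 impressions")

    # At web scale, a small lift in predicted click rate converts directly
    # into revenue, which is why behavioral data is worth so much.
    impressions = 5_000_000_000          # invented volume
    for ctr in (0.010, 0.011):           # a 10% relative improvement
        print(f"CTR {ctr:.3f} -> ${2.00 * ctr * impressions:,.0f}")

Ranking by bid times predicted response is the design choice that makes behavioral data the decisive input: the better the prediction, the more each impression earns.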
caelengrubb

Here's How Social Media Affects Your Mental Health | McLean Hospital - 0 views

  • The platforms are designed to be addictive and are associated with anxiety, depression, and even physical ailments.
  • According to the Pew Research Center, 69% of adults and 81% of teens in the U.S. use social media.
  • To boost self-esteem and feel a sense of belonging in their social circles, people post content with the hope of receiving positive feedback. Couple that content with the structure of potential future reward, and you get a recipe for constantly checking platforms.
  • ...8 more annotations...
  • FOMO—fear of missing out—also plays a role
  • A 2018 British study tied social media use to decreased, disrupted, and delayed sleep, which is associated with depression, memory loss, and poor academic performance.
  • Social media use can affect users’ physical health even more directly. Researchers know the connection between the mind and the gut can turn anxiety and depression into nausea, headaches, muscle tension, and tremors.
  • The earlier teens start using social media, the greater impact the platforms have on mental health. This is especially true for females. While teen males tend to express aggression physically, females do so relationally by excluding others and sharing hurtful comments. Social media increases the opportunity for such harmful interactions.
  • In the past, teens read magazines that contained altered photos of models. Now, these images are one thumb-scroll away at any given time. Apps that provide the user with airbrushing, teeth whitening, and more filters are easy to find and easier to use. It’s not only celebrities who look perfect—it’s everyone.
  • In recent years, plastic surgeons have seen an uptick in requests from patients who want to look like their filtered Snapchat and Instagram photos.
  • A person experiences impostor syndrome when feeling chronic self-doubt and a sense of being exposed as ‘a fraud’ in terms of success and intellect. “Whether it’s another pretty vacation or someone’s bouquet of flowers, my mind went from ‘Why not me?’ to ‘I don’t deserve those things, and I don’t know why,’ and it made me feel awful.”
  • Sperling encourages people to conduct their own behavior experiments by rating their emotions on a scale of 0-10, with 10 being the most intensely one could experience an emotion, before and after using social media sites at the same time each day for a week.
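For readers who want to try it, the exercise translates directly into a small script. The sketch below is one minimal interpretation of the method as described; the file name, CSV layout, and prompts are illustrative assumptions, not part of Sperling's protocol.

    import csv
    from datetime import date
    from pathlib import Path

    LOG = Path("mood_log.csv")  # illustrative file name

    def ask(prompt: str) -> int:
        # Keep asking until the answer is a whole number from 0 to 10.
        while True:
            answer = input(prompt).strip()
            if answer.isdigit() and 0 <= int(answer) <= 10:
                return int(answer)
            print("Enter a whole number from 0 to 10.")

    def record_today() -> None:
        # One paired observation per day, taken at the same time of day.
        before = ask("Mood before social media (0-10): ")
        after = ask("Mood after social media (0-10): ")
        is_new = not LOG.exists()
        with LOG.open("a", newline="") as f:
            writer = csv.writer(f)
            if is_new:
                writer.writerow(["date", "before", "after"])
            writer.writerow([date.today().isoformat(), before, after])

    def summarize() -> None:
        # After a week, the average shift shows the direction of the effect.
        with LOG.open() as f:
            rows = list(csv.DictReader(f))
        shifts = [int(r["after"]) - int(r["before"]) for r in rows]
        print(f"{len(rows)} days logged; average mood shift {sum(shifts) / len(shifts):+.1f}")

    if __name__ == "__main__":
        record_today()
        summarize()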
Javier E

AI is about to completely change how you use computers | Bill Gates - 0 views

  • Health care
  • Entertainment and shopping
  • Today, AI’s main role in healthcare is to help with administrative tasks. Abridge, Nuance DAX, and Nabla Copilot, for example, can capture audio during an appointment and then write up notes for the doctor to review.
  • ...38 more annotations...
  • agents will open up many more learning opportunities.
  • Already, AI can help you pick out a new TV and recommend movies, books, shows, and podcasts. Likewise, a company I’ve invested in, recently launched Pix, which lets you ask questions (“Which Robert Redford movies would I like and where can I watch them?”) and then makes recommendations based on what you’ve liked in the past
  • Productivity
  • copilots can do a lot—such as turn a written document into a slide deck, answer questions about a spreadsheet using natural language, and summarize email threads while representing each person’s point of view.
  • before the sophisticated agents I’m describing become a reality, we need to confront a number of questions about the technology and how we’ll use it.
  • Helping patients and healthcare workers will be especially beneficial for people in poor countries, where many never get to see a doctor at all.
  • To create a new app or service, you won’t need to know how to write code or do graphic design. You’ll just tell your agent what you want. It will be able to write the code, design the look and feel of the app, create a logo, and publish the app to an online store
  • Agents will do even more. Having one will be like having a person dedicated to helping you with various tasks and doing them independently if you want. If you have an idea for a business, an agent will help you write up a business plan, create a presentation for it, and even generate images of what your product might look like
  • For decades, I’ve been excited about all the ways that software would make teachers’ jobs easier and help students learn. It won’t replace teachers, but it will supplement their work—personalizing the work for students and liberating teachers from paperwork and other tasks so they can spend more time on the most important parts of the job.
  • Mental health care is another example of a service that agents will make available to virtually everyone. Today, weekly therapy sessions seem like a luxury. But there is a lot of unmet need, and many people who could benefit from therapy don’t have access to it.
  • I don’t think any single company will dominate the agents business--there will be many different AI engines available.
  • The real shift will come when agents can help patients do basic triage, get advice about how to deal with health problems, and decide whether they need to seek treatment.
  • They’ll replace word processors, spreadsheets, and other productivity apps.
  • Education
  • For example, few families can pay for a tutor who works one-on-one with a student to supplement their classroom work. If agents can capture what makes a tutor effective, they’ll unlock this supplemental instruction for everyone who wants it. If a tutoring agent knows that a kid likes Minecraft and Taylor Swift, it will use Minecraft to teach them about calculating the volume and area of shapes, and Taylor’s lyrics to teach them about storytelling and rhyme schemes. The experience will be far richer—with graphics and sound, for example—and more personalized than today’s text-based tutors.
  • your agent will be able to help you in the same way that personal assistants support executives today. If your friend just had surgery, your agent will offer to send flowers and be able to order them for you. If you tell it you’d like to catch up with your old college roommate, it will work with their agent to find a time to get together, and just before you arrive, it will remind you that their oldest child just started college at the local university.
  • To see the dramatic change that agents will bring, let’s compare them to the AI tools available today. Most of these are bots. They’re limited to one app and generally only step in when you write a particular word or ask for help. Because they don’t remember how you use them from one time to the next, they don’t get better or learn any of your preferences.
  • The current state of the art is Khanmigo, a text-based bot created by Khan Academy. It can tutor students in math, science, and the humanities—for example, it can explain the quadratic formula and create math problems to practice on. It can also help teachers do things like write lesson plans.
  • Businesses that are separate today—search advertising, social networking with advertising, shopping, productivity software—will become one business.
  • other issues won’t be decided by companies and governments. For example, agents could affect how we interact with friends and family. Today, you can show someone that you care about them by remembering details about their life—say, their birthday. But when they know your agent likely reminded you about it and took care of sending flowers, will it be as meaningful for them?
  • In the computing industry, we talk about platforms—the technologies that apps and services are built on. Android, iOS, and Windows are all platforms. Agents will be the next platform.
  • A shock wave in the tech industry
  • Agents won’t simply make recommendations; they’ll help you act on them. If you want to buy a camera, you’ll have your agent read all the reviews for you, summarize them, make a recommendation, and place an order for it once you’ve made a decision.
  • Agents will affect how we use software as well as how it’s written. They’ll replace search sites because they’ll be better at finding information and summarizing it for you
  • they’ll be dramatically better. You’ll be able to have nuanced conversations with them. They will be much more personalized, and they won’t be limited to relatively simple tasks like writing a letter.
  • Companies will be able to make agents available for their employees to consult directly and be part of every meeting so they can answer questions.
  • AI agents that are well trained in mental health will make therapy much more affordable and easier to get. Wysa and Youper are two of the early chatbots here. But agents will go much deeper. If you choose to share enough information with a mental health agent, it will understand your life history and your relationships. It’ll be available when you need it, and it will never get impatient. It could even, with your permission, monitor your physical responses to therapy through your smart watch—like if your heart starts to race when you’re talking about a problem with your boss—and suggest when you should see a human therapist.
  • If the number of companies that have started working on AI just this year is any indication, there will be an exceptional amount of competition, which will make agents very inexpensive.
  • Agents are smarter. They’re proactive—capable of making suggestions before you ask for them. They accomplish tasks across applications. They improve over time because they remember your activities and recognize intent and patterns in your behavior. Based on this information, they offer to provide what they think you need, although you will always make the final decisions. (A minimal sketch of this remember-then-suggest loop appears after this list.)
  • Agents are not only going to change how everyone interacts with computers. They’re also going to upend the software industry, bringing about the biggest revolution in computing since we went from typing commands to tapping on icons.
  • The most exciting impact of AI agents is the way they will democratize services that today are too expensive for most people
  • The ramifications for the software business and for society will be profound.
  • In the next five years, this will change completely. You won’t have to use different apps for different tasks. You’ll simply tell your device, in everyday language, what you want to do. And depending on how much information you choose to share with it, the software will be able to respond personally because it will have a rich understanding of your life. In the near future, anyone who’s online will be able to have a personal assistant powered by artificial intelligence that’s far beyond today’s technology.
  • You’ll also be able to get news and entertainment that’s been tailored to your interests. CurioAI, which creates a custom podcast on any subject you ask about, is a glimpse of what’s coming.
  • An agent will be able to help you with all your activities if you want it to. With permission to follow your online interactions and real-world locations, it will develop a powerful understanding of the people, places, and activities you engage in. It will get your personal and work relationships, hobbies, preferences, and schedule. You’ll choose how and when it steps in to help with something or ask you to make a decision.
  • even the best sites have an incomplete understanding of your work, personal life, interests, and relationships and a limited ability to use this information to do things for you. That’s the kind of thing that is only possible today with another human being, like a close friend or personal assistant.
  • In the distant future, agents may even force humans to face profound questions about purpose. Imagine that agents become so good that everyone can have a high quality of life without working nearly as much. In a future like that, what would people do with their time? Would anyone still want to get an education when an agent has all the answers? Can you have a safe and thriving society when most people have a lot of free time on their hands?
  • They’ll have an especially big influence in four areas: health care, education, productivity, and entertainment and shopping.
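The distinction these excerpts draw, between bots that forget between sessions and agents that accumulate context and volunteer help, reduces at its simplest to persistent state plus a proactive step. The sketch below is speculative illustration only: the JSON store, the interest counter, and the one-line suggestion rule are invented stand-ins for the far richer models the excerpts describe.

    import json
    from collections import Counter
    from pathlib import Path

    STORE = Path("agent_memory.json")  # illustrative persistent store

    class Agent:
        def __init__(self) -> None:
            # Reload whatever was learned in earlier sessions; a stateless
            # bot would start from zero every time.
            raw = json.loads(STORE.read_text()) if STORE.exists() else {}
            self.interests = Counter(raw)

        def observe(self, activity: str) -> None:
            # Remember the activity and persist it across sessions.
            self.interests[activity] += 1
            STORE.write_text(json.dumps(self.interests))

        def suggest(self) -> str:
            # Proactive step: volunteer something before being asked.
            if not self.interests:
                return "I don't know you yet. Tell me what you're into."
            topic, _ = self.interests.most_common(1)[0]
            return f"You've been into {topic} lately. Want related recommendations?"

    agent = Agent()
    for activity in ["astronomy", "astronomy", "cooking"]:
        agent.observe(activity)
    print(agent.suggest())  # -> "You've been into astronomy lately. ..."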
kushnerha

BBC - Future - Will emoji become a new language? - 2 views

  • Emoji are now used in around half of all sentences on sites like Instagram, and Facebook looks set to introduce them alongside the famous “like” button as a way of expressing your reaction to a post.
  • If you were to believe the headlines, this is just the tipping point: some outlets have claimed that emoji are an emerging language that could soon compete with English in global usage. To many, this would be an exciting evolution of the way we communicate; to others, it is linguistic Armageddon.
  • Do emoji show the same characteristics of other communicative systems and actual languages? And what do they help us to express that words alone can’t say? When emoji appear with text, they often supplement or enhance the writing. This is similar to gestures that appear along with speech. Over the past three decades, research has shown that our hands provide important information that often transcends and clarifies the message in speech. Emoji serve this function too – for instance, adding a kissy or winking face can disambiguate whether a statement is flirtatiously teasing or just plain mean.
  • ...17 more annotations...
  • This is a key point about language use: rarely is natural language ever limited to speech alone. When we are speaking, we constantly use gestures to illustrate what we mean. For this reason, linguists say that language is “multi-modal”. Writing takes away that extra non-verbal information, but emoji may allow us to re-incorporate it into our text.
  • Emoji are not always used as embellishments, however – sometimes, strings of the characters can themselves convey meaning in a longer sequence on their own. But to constitute their own language, they would need a key component: grammar.
  • A grammatical system is a set of constraints that governs how the meaning of an utterance is packaged in a coherent way. Natural language grammars have certain traits that distinguish them. For one, they have individual units that play different roles in the sequence – like nouns and verbs in a sentence. Also, grammar is different from meaning
  • When emoji are isolated, they are primarily governed by simple rules related to meaning alone, without these more complex rules. For instance, according to research by Tyler Schnoebelen, people often create strings of emoji that share a common meaning
  • This sequence has little internal structure; even when it is rearranged, it still conveys the same message. These images are connected solely by their broader meaning. We might consider them to be a visual list: “here are all things related to celebrations and birthdays.” Lists are certainly a conventionalised way of communicating, but they don’t have grammar the way that sentences do.
  • What if the order did matter though? What if they conveyed a temporal sequence of events? Consider this example, which means something like “a woman had a party where they drank, and then opened presents and then had cake”:
  • In all cases, the doer of the action (the agent) precedes the action. In fact, this pattern is commonly found in both full languages and simple communication systems. For example, the majority of the world’s languages place the subject before the verb of a sentence.
  • These rules may seem like the seeds of grammar, but psycholinguist Susan Goldin-Meadow and colleagues have found this order appears in many other systems that would not be considered a language. For example, this order appears when people arrange pictures to describe events from an animated cartoon, or when speaking adults communicate using only gestures. It also appears in the gesture systems created by deaf children who cannot hear spoken languages and are not exposed to sign languages.
  • describes the children as lacking exposure to a language and thus invent their own manual systems to communicate, called “homesigns”. These systems are limited in the size of their vocabularies and the types of sequences they can create. For this reason, the agent-act order seems not to be due to a grammar, but from basic heuristics – practical workarounds – based on meaning alone. Emoji seem to tap into this same system.
  • Nevertheless, some may argue that despite emoji’s current simplicity, this may be the groundwork for emerging complexity – that although emoji do not constitute a language at the present time, they could develop into one over time.
  • Could an emerging “emoji visual language” be developing in a similar way, with actual grammatical structure? To answer that question, you need to consider the intrinsic constraints on the technology itself. Emoji are created by typing into a computer like text. But, unlike text, most emoji are provided as whole units, except for the limited set of emoticons which convert to emoji, like :) or ;). When writing text, we use the building blocks (letters) to create the units (words), not by searching through a list of every whole word in the language. (The snippet after this list illustrates this unit-versus-building-block constraint at the Unicode level.)
  • emoji force us to convey information in a linear unit-unit string, which limits how complex expressions can be made. These constraints may mean that they will never be able to achieve even the most basic complexity that we can create with normal and natural drawings.
  • What’s more, these limits also prevent users from creating novel signs – a requisite for all languages, especially emerging ones. Users have no control over the development of the vocabulary. As the “vocab list” for emoji grows, it will become increasingly unwieldy: using them will require a conscious search process through an external list, not an easy generation from our own mental vocabulary, like the way we naturally speak or draw. This is a key point – it means that emoji lack the flexibility needed to create a new language.
  • we already have very robust visual languages, as can be seen in comics and graphic novels. As I argue in my book, The Visual Language of Comics, the drawings found in comics use a systematic visual vocabulary (such as stink lines to represent smell, or stars to represent dizziness). Importantly, the available vocabulary is not constrained by technology and has developed naturally over time, like spoken and written languages.
  • grammar of sequential images is more of a narrative structure – not of nouns and verbs. Yet, these sequences use principles of combination like any other grammar, including roles played by images, groupings of images, and hierarchic embedding.
  • measured participants’ brainwaves while they viewed sequences one image at a time where a disruption appeared either within the groupings of panels or at the natural break between groupings. The particular brainwave responses that we observed were similar to those that experimenters find when violating the syntax of sentences. That is, the brain responds the same way to violations of “grammar”, whether in sentences or sequential narrative images.
  • I would hypothesise that emoji can use a basic narrative structure to organise short stories (likely made up of agent-action sequences), but I highly doubt that they would be able to create embedded clauses like these. I would also doubt that you would see the same kinds of brain responses that we saw with the comic strip sequences.
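The "whole units" point has a concrete Unicode basis: each emoji is a single code point, or a fixed zero-width-joiner (ZWJ) sequence of code points, so users select from a closed list but cannot productively build new signs the way letters build new words. A small standard-library illustration in Python:

    # Emoji are closed-vocabulary units: single Unicode code points or
    # fixed zero-width-joiner (ZWJ) sequences defined by the standard.
    import unicodedata

    word = "cake"            # assembled freely from reusable letters
    emoji = "\U0001F382"     # one atomic unit: U+1F382 BIRTHDAY CAKE
    print(len(word), len(emoji))    # 4 letters vs 1 code point
    print(unicodedata.name(emoji))  # BIRTHDAY CAKE

    # Multi-part emoji are fixed recipes joined by U+200D (ZERO WIDTH
    # JOINER); only vendor-supported sequences render, so users cannot
    # coin novel signs by recombination.
    zwj_seq = "\U0001F469\u200D\U0001F4BB"  # the "woman technologist" emoji
    print([unicodedata.name(ch) for ch in zwj_seq])
    # -> ['WOMAN', 'ZERO WIDTH JOINER', 'PERSONAL COMPUTER']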
ardenganse

Living With Aphantasia, the Inability to Make Mental Images - The New York Times - 0 views

  • Many educators believe visualization is key to reading comprehension since it allows readers to organize information in their minds, make inferences, and remember content more effectively.
  • Aphantasia is not a monolithic condition. People who believe they have aphantasia, known as aphants, debate in online groups about whether it should be deemed a disability. Some who are just finding out about their condition in their 50s or 60s say they never felt hindered, while others believe they failed courses in school because of it.
    • ardenganse
       
      It's interesting how people have different experiences with Aphantasia.
  • Not being able to visualize means never picturing the faces of family or close friends and remembering images as abstract information.
    • ardenganse
       
      We don't tend to realize how essential this is to our lives, whether or not our memories are actually reliable.
  • ...3 more annotations...
  • It might be while reminiscing about the past and realizing they’re having a different experience with memory than their friends or family.
  • Ultimately, aphantasia is just one of the many ways that people’s brains and learning styles are different.
  • When I close my eyes, all I see is faint blue dots and darkness, and for 19 years, I assumed that’s what everyone else saw too.