TOK Friends: Group items tagged Neurological

jongardner04

Ghost Illusion Created in the Lab | Neuroscience News Research Articles | Neuroscience ... - 0 views

  • Ghosts exist only in the mind, and scientists know just where to find them, an EPFL study suggests
  • In their experiment, Blanke’s team interfered with the sensorimotor input of participants in such a way that their brains no longer identified such signals as belonging to their own body, but instead interpreted them as those of someone else.
  • The researchers first analyzed the brains of 12 patients with neurological disorders – mostly epilepsy – who had experienced this kind of “apparition.” MRI analysis of the patients’ brains revealed interference with three cortical regions: the insular cortex, parietal-frontal cortex, and the temporo-parietal cortex.
  • The participants were unaware of the experiment’s purpose.
  • Instinctively, several subjects reported a strong “feeling of a presence,” even counting up to four “ghosts” where none existed.
  •  
    Scientists performed an experiment creating the illusion of a ghost. This relates to the idea of sense perception. 
aqconces

What it will take for a head transplant to work - The Washington Post - 0 views

  • Sergio Canavero, an Italian neurosurgeon at the Turin Advanced Neuromodulation Group, has claimed that advances in medical science now make it possible to carry out head transplants that would allow patients not only to survive but to function normally
  • “I think we are now at a point when the technical aspects are all feasible,” Canavero told New Scientist.
  • While expert opinions on Canavero’s claims vary, the possibility isn’t as far-fetched as it sounds. James Harrop, director of Adult Reconstructive Spine at Thomas Jefferson University in Philadelphia and co-editor of Congress of Neurological Surgeons, says that the kind of complications the surgeons faced back in 1970 could easily be fixed using today’s methods.
  • “Technically it’s not any harder than a liver and heart transplant,” he says
Emilio Ergueta

Notes Towards a Philosophy of Sleep | Issue 91 | Philosophy Now - 0 views

  • Meeting Christopher after a long interval reminded me of his excellent book Living Philosophy: Reflections on Life, Meaning and Morality (2001). The volume includes a fascinating essay entitled ‘The Need to Sleep’, where he notes that philosophers have not paid sufficient attention to this extraordinary phenomenon. Well, a decade on, this is the beginning of a response to Christopher’s wake-up call.
  • If I told you that I had a neurological disease which meant that for eight or more hours a day I lost control of my faculties, bade farewell to the outside world, and was subject to complex hallucinations and delusions – such as being chased by a grizzly bear at Stockport Railway Station – you would think I was in a pretty bad way.
  • Of course, sleep is not a disease at all, but the condition of daily (nightly) life for the vast majority of us. The fact that we accept without surprise the need for a prolonged black-out as part of our daily life highlights our tendency to take for granted anything about our condition that is universal.
  • Honest philosophers know they cannot complain about casting their philosophical pearls before drowsy swine, because they, too, have fallen asleep over the works of philosophers greater than themselves.
  • Not only is sleep a reminder of our ultimate helplessness, or even of how circumscribed a place thought sometimes plays in our lives, there is also the fear of contagion, as if talking about sleep might induce it – just as this reference to yawning will get at least 50% of you yawning in the next 15 minutes. (It’s a fact, honest!)
  • Since all animals sleep, we assume it has a biological purpose. The trouble is, we don’t know what that purpose is. There are many theories – energy conservation, growth promotion, immobilisation during hours of darkness when it might be dangerous to be out and about, consolidation of memories – but they are all open to serious objections.
  • Dreams, of course, have figured more significantly in philosophy. Being a mode of consciousness – prompting Aristotle to say that “the soul makes assertions in sleep” (On Dreams 458b) – dreams seem one step up from the mere putting out of zzzs.
  • they place a philosophically interesting question mark against our confidence in the nature of the world we appear to share with others.
  • Naturally, dreams preoccupied him as much as the daily resurrection of the self. He suggested that dreams might be an attempt to make sense of the body’s passage from sleep to wakefulness.
  • nothing is more sleep-inducing than the egocentric tales of someone else’s solipsistic dreams. We long to hear that magic phrase “And then I woke up.”
kushnerha

Aphantasia: A life without mental images - BBC News - 0 views

  • Most people can readily conjure images inside their head - known as their mind's eye. But this year scientists have described a condition, aphantasia, in which some people are unable to visualise mental images.
  • Our memories are often tied up in images: think back to a wedding or first day at school. As a result, Niel admits, some aspects of his memory are "terrible", but he is very good at remembering facts. And, like others with aphantasia, he struggles to recognise faces. Yet he does not see aphantasia as a disability, but simply a different way of experiencing life.
  • "When I think about my fiancee there is no image, but I am definitely thinking about her, I know today she has her hair up at the back, she's brunette." "But I'm not describing an image I am looking at, I'm remembering features about her, that's the strangest thing and maybe that is a source of some regret." The response from his mates is a very sympathetic: "You're weird."
  • One person who took part in a study into aphantasia said he had started to feel "isolated" and "alone" after discovering that other people could see images in their heads. Being unable to reminisce about his mother years after her death led to him being "extremely distraught".
  • Adam Zeman, a professor of cognitive and behavioural neurology, wants to compare the lives and experiences of people with aphantasia and its polar-opposite hyperphantasia. His team, based at the University of Exeter, coined the term aphantasia this year.
  • How we imagine is clearly very subjective - one person's vivid scene could be another's grainy picture. But Prof Zeman is certain that aphantasia is real. People often report being able to dream in pictures, and there have been reported cases of people losing the ability to think in images after a brain injury. He is adamant that aphantasia is "not a disorder" and says it may affect up to one in 50 people.
kushnerha

Which Type of Exercise Is Best for the Brain? - The New York Times - 1 views

  • Some forms of exercise may be much more effective than others at bulking up the brain, according to a remarkable new study in rats. For the first time, scientists compared head-to-head the neurological impacts of different types of exercise: running, weight training and high-intensity interval training. The surprising results suggest that going hard may not be the best option for long-term brain health.
  • exercise changes the structure and function of the brain. Studies in animals and people have shown that physical activity generally increases brain volume and can reduce the number and size of age-related holes in the brain’s white and gray matter.
  • Exercise also, and perhaps most resonantly, augments adult neurogenesis, which is the creation of new brain cells in an already mature brain. In studies with animals, exercise, in the form of running wheels or treadmills, has been found to double or even triple the number of new neurons that appear afterward in the animals’ hippocampus, a key area of the brain for learning and memory, compared to the brains of animals that remain sedentary. Scientists believe that exercise has similar impacts on the human hippocampus.
  • These past studies of exercise and neurogenesis understandably have focused on distance running. Lab rodents know how to run. But whether other forms of exercise likewise prompt increases in neurogenesis has been unknown and is an issue of increasing interest
  • In the new study, which was published this month in the Journal of Physiology, researchers at the University of Jyvaskyla in Finland and other institutions gathered a large group of adult male rats. The researchers injected the rats with a substance that marks new brain cells and then set groups of them to an array of different workouts, with one group remaining sedentary to serve as controls.
  • They found very different levels of neurogenesis, depending on how each animal had exercised. Those rats that had jogged on wheels showed robust levels of neurogenesis. Their hippocampal tissue teemed with new neurons, far more than in the brains of the sedentary animals. The greater the distance that a runner had covered during the experiment, the more new cells its brain now contained. There were far fewer new neurons in the brains of the animals that had completed high-intensity interval training. They showed somewhat higher amounts than in the sedentary animals but far less than in the distance runners. And the weight-training rats, although they were much stronger at the end of the experiment than they had been at the start, showed no discernible augmentation of neurogenesis. Their hippocampal tissue looked just like that of the animals that had not exercised at all.
  • The study concluded that “sustained aerobic exercise might be most beneficial for brain health also in humans.”
  • Just why distance running was so much more potent at promoting neurogenesis than the other workouts is not clear, although Dr. Nokia and her colleagues speculate that distance running stimulates the release of a particular substance in the brain known as brain-derived neurotrophic factor that is known to regulate neurogenesis. The more miles an animal runs, the more B.D.N.F. it produces. Weight training, on the other hand, while extremely beneficial for muscular health, has previously been shown to have little effect on the body’s levels of B.D.N.F.
  • As for high-intensity interval training, its potential brain benefits may be undercut by its very intensity, Dr. Nokia said. It is, by intent, much more physiologically draining and stressful than moderate running, and “stress tends to decrease adult hippocampal neurogenesis,” she said.
  • These results do not mean, however, that only running and similar moderate endurance workouts strengthen the brain, Dr. Nokia said. Those activities do seem to prompt the most neurogenesis in the hippocampus. But weight training and high-intensity intervals probably lead to different types of changes elsewhere in the brain. They might, for instance, encourage the creation of additional blood vessels or new connections between brain cells or between different parts of the brain.
kushnerha

Philosophy's True Home - The New York Times - 0 views

  • We’ve all heard the argument that philosophy is isolated, an “ivory tower” discipline cut off from virtually every other progress-making pursuit of knowledge, including math and the sciences, as well as from the actual concerns of daily life. The reasons given for this are many. In a widely read essay in this series, “When Philosophy Lost Its Way,” Robert Frodeman and Adam Briggle claim that it was philosophy’s institutionalization in the university in the late 19th century that separated it from the study of humanity and nature, now the province of social and natural sciences.
  • This institutionalization, the authors claim, led it to betray its central aim of articulating the knowledge needed to live virtuous and rewarding lives. I have a different view: Philosophy isn’t separated from the social, natural or mathematical sciences, nor is it neglecting the study of goodness, justice and virtue, which was never its central aim.
  • identified philosophy with informal linguistic analysis. Fortunately, this narrow view didn’t stop them from contributing to the science of language and the study of law. Now long gone, neither movement defined the philosophy of its day and neither arose from locating it in universities.
  • The authors claim that philosophy abandoned its relationship to other disciplines by creating its own purified domain, accessible only to credentialed professionals. It is true that from roughly 1930 to 1950, some philosophers — logical empiricists, in particular — did speak of philosophy having its own exclusive subject matter. But since that subject matter was logical analysis aimed at unifying all of science, interdisciplinarity was front and center.
  • philosopher-mathematicians Gottlob Frege, Bertrand Russell, Kurt Gödel, Alonzo Church and Alan Turing invented symbolic logic, helped establish the set-theoretic foundations of mathematics, and gave us the formal theory of computation that ushered in the digital age
  • developed ideas relating logic to linguistic meaning that provided a framework for studying meaning in all human languages. Others, including Paul Grice and J.L. Austin, explained how linguistic meaning mixes with contextual information to enrich communicative contents and how certain linguistic performances change social facts. Today a new philosophical conception of the relationship between meaning and cognition adds a further dimension to linguistic science.
  • Decision theory — the science of rational norms governing action, belief and decision under uncertainty — was developed by the 20th-century philosophers Frank Ramsey, Rudolf Carnap, Richard Jeffrey and others. It plays a foundational role in political science and economics by telling us what rationality requires, given our evidence, priorities and the strength of our beliefs. Today, no area of philosophy is more successful in attracting top young minds.
  • Philosophy also assisted psychology in its long march away from narrow behaviorism and speculative Freudianism. The mid-20th-century functionalist perspective pioneered by Hilary Putnam was particularly important. According to it, pain, pleasure and belief are neither behavioral dispositions nor bare neurological states. They are interacting internal causes, capable of very different physical realizations, that serve the goals of individuals in specific ways. This view is now embedded in cognitive psychology and neuroscience.
  • Philosophy also played a role in 20th-century physics, influencing the great physicists Albert Einstein, Niels Bohr and Werner Heisenberg. The philosophers Moritz Schlick and Hans Reichenbach reciprocated that interest by assimilating the new physics into their philosophies.
  • Philosophy of biology is following a similar path. Today’s philosophy of science is less accessible than Aristotle’s natural philosophy chiefly because it systematizes a larger, more technically sophisticated body of knowledge.
  • Philosophy’s interaction with mathematics, linguistics, economics, political science, psychology and physics requires specialization. Far from fostering isolation, this specialization makes communication and cooperation among disciplines possible. This has always been so.
  • Nor did scientific progress rob philosophy of its former scientific subject matter, leaving it to concentrate on the broadly moral. In fact, philosophy thrives when enough is known to make progress conceivable, but it remains unachieved because of methodological confusion. Philosophy helps break the impasse by articulating new questions, posing possible solutions and forging new conceptual tools.
  • Our knowledge of the universe and ourselves expands like a ripple surrounding a pebble dropped in a pool. As we move away from the center of the spreading circle, its area, representing our secure knowledge, grows. But so does its circumference, representing the border where knowledge blurs into uncertainty and speculation, and methodological confusion returns. Philosophy patrols the border, trying to understand how we got there and to conceptualize our next move.  Its job is unending.
  • Although progress in ethics, political philosophy and the illumination of life’s meaning has been less impressive than advances in some other areas, it is accelerating.
  • the advances in our understanding because of careful formulation and critical evaluation of theories of goodness, rightness, justice and human flourishing by philosophers since 1970 compare well to the advances made by philosophers from Aristotle to 1970
  • The knowledge required to maintain philosophy’s continuing task, including its vital connection to other disciplines, is too vast to be held in one mind. Despite the often-repeated idea that philosophy’s true calling can only be fulfilled in the public square, philosophers actually function best in universities, where they acquire and share knowledge with their colleagues in other disciplines. It is also vital for philosophers to engage students — both those who major in the subject, and those who do not. Although philosophy has never had a mass audience, it remains remarkably accessible to the average student; unlike the natural sciences, its frontiers can be reached in a few undergraduate courses.
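The expected-utility core of the decision theory described in the excerpts above can be shown in a few lines. This is a minimal illustrative sketch; the actions, probabilities and utilities are invented, not drawn from the article:

```python
# Minimal sketch of expected-utility choice under uncertainty.
# All probabilities and utilities are invented for illustration.

def expected_utility(prospects):
    """prospects: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in prospects)

# Two candidate actions, each with uncertain outcomes (rain vs. no rain):
actions = {
    "take umbrella": [(0.3, 8), (0.7, 6)],   # dry either way, mild hassle
    "leave it home": [(0.3, 0), (0.7, 10)],  # soaked if it rains
}

# On this theory, rationality requires the action with the highest
# expected utility, given your probabilities (beliefs) and utilities.
best = max(actions, key=lambda a: expected_utility(actions[a]))
```

On these made-up numbers the second action wins (expected utility 7.0 against 6.6), but raising the rain probability above one-third flips the verdict, which is the sense in which what rationality requires depends on the strength of one's beliefs.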
kushnerha

Learning a New Sport May Be Good for the Brain - The New York Times - 0 views

  • Learning in midlife to juggle, swim, ride a bicycle or, in my case, snowboard could change and strengthen the brain in ways that practicing other familiar pursuits such as crossword puzzles or marathon training will not, according to an accumulating body of research about the unique impacts of motor learning on the brain.
  • Such complex thinking generally is classified as “higher-order” cognition and results in activity within certain portions of the brain and promotes plasticity, or physical changes, in those areas. There is strong evidence that learning a second language as an adult, for instance, results in increased white matter in the parts of the brain known to be involved in language processing.
  • Regular exercise likewise changes the brain, as I frequently have written, with studies in animals showing that running and other types of physical activities increase the number of new brain cells created in parts of the brain that are integral to memory and thinking.
  • But the impacts of learning on one of the most primal portions of the brain have been surprisingly underappreciated, both scientifically and outside the lab. Most of us pay little attention to our motor cortex, which controls how well we can move.
  • We like watching athletes in action, he said. But most of us make little effort to hone our motor skills in adulthood, and very few of us try to expand them by, for instance, learning a new sport. We could be short-changing our brains. Past neurological studies in people have shown that learning a new physical skill in adulthood, such as juggling, leads to increases in the volume of gray matter in parts of the brain related to movement control.
  • Even more compelling, a 2014 study with mice found that when the mice were introduced to a complicated type of running wheel, in which the rungs were irregularly spaced so that the animals had to learn a new, stutter-step type of running, their brains changed significantly. Learning to use these new wheels led to increased myelination of neurons in the animals’ motor cortexes. Myelination is the process by which parts of a brain cell are insulated, so that the messages between neurons can proceed more quickly and smoothly.
  • Scientists once believed that myelination in the brain occurs almost exclusively during infancy and childhood and then slows or halts altogether. But the animals running on the oddball wheels showed notable increases in the myelination of the neurons in their motor cortex even though they were adults.
  • In other words, learning the new skill had changed the inner workings of the adult animals’ motor cortexes; practicing a well-mastered one had not. “We don’t know” whether comparable changes occur within the brains of grown people who take up a new sport or physical skill, Dr. Krakauer said. But it seems likely, he said. “Motor skills are as cognitively challenging” in their way as traditional brainteasers such as crossword puzzles or brain-training games, he said. So adding a new sport to your repertory should have salutary effects on your brain, and also, unlike computer-based games, provide all the physical benefits of exercise.
kushnerha

BBC - Future - What Sherlock Holmes taught us about the mind - 0 views

  • The century-old detective stories are being studied by today’s neurologists – but why? As it turns out, not even modern technology can replace their lessons in rational thinking.
  • Arthur Conan Doyle was a physician himself, and there is evidence that he modelled the character of Holmes on one of the leading doctors of the day, Joseph Bell of the Royal Edinburgh Infirmary. “I thought I would try my hand at writing a story where the hero would treat crime as Dr Bell treated disease,”
  • Conan Doyle may have also drawn some inspiration from other doctors, such as William Gowers, who wrote the Bible of Neurology
  • Gowers often taught his students to begin their diagnosis from the moment a patient walked through the door
  • “Did you notice him as he came into the room? If you did not then you should have done so. One of the habits to be acquired and never omitted is to observe a patient as he enters the room; to note his aspect and his gait. If you did so, you would have seen that he seemed lame, and you may have been struck by that which must strike you now – an unusual tint of his face.”
  • the importance of the seemingly inconsequential that seems to inspire both men. “It has long been an axiom of mine that the little things are infinitely the most important,” Conan Doyle wrote
  • Both Gowers and Holmes also warned against letting your preconceptions fog your judgement. For both men, cool, unprejudiced observation was the order of the day. It is for this reason that Holmes chastises Watson in A Scandal in Bohemia: “You see, but you do not observe. The distinction is clear.”
  • Gowers: “The method you should adopt is this: Whenever you find yourself in the presence of a case that is not familiar to you in all its detail forget for a time all your types and all your names. Deal with the case as one that has never been seen before, and work it out as a new problem sui generis, to be investigated as such.”
  • both men “reasoned backwards”, for instance, dissecting all the possible paths that may have led to a particular disease (in Gowers’ case) or murder (in Holmes’)
  • Holmes’ most famous aphorism: “When you have eliminated the impossible, whatever remains, however improbable, must be the truth.”
  • the most important lesson to be learned, from both Gowers and Holmes, is the value of recognising your errors. “Gentlemen – It is always pleasant to be right, but it is generally a much more useful thing to be wrong,” wrote Gowers
  • This humility is key in beating the ‘curse of expertise’ that afflicts so many talented and intelligent people.
  • University College London has documented many instances in which apparent experts in both medicine and forensic science have allowed their own biases to cloud their judgements – sometimes even in life or death situations.
  • Even the most advanced technology can never replace the powers of simple observation and rational deduction.
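Holmes's elimination maxim quoted above is effectively an algorithm: discard every hypothesis that contradicts an observation and assert whatever survives. A toy sketch, with invented hypotheses and case facts (nothing here is from the article):

```python
# Toy version of "eliminate the impossible": drop every hypothesis whose
# required features are not all among the observed facts. The hypotheses
# and observations are invented for illustration.

hypotheses = {
    "burglar":  {"forced_entry", "stranger"},
    "insider":  {"knew_layout", "no_forced_entry"},
    "accident": {"no_motive", "no_forced_entry"},
}

observations = {"no_forced_entry", "knew_layout"}

def survives(required, observed):
    # Keep a hypothesis only if everything it requires was observed;
    # otherwise the evidence has "eliminated" it.
    return required <= observed  # set-inclusion test

remaining = [name for name, required in hypotheses.items()
             if survives(required, observations)]
```

Whatever remains ("insider", here), however improbable, must be the truth, on Holmes's rule; the hard part in practice, as the Gowers quotes stress, is observing carefully enough to do the eliminating.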
Javier E

Opinion | How to Be More Resilient - The New York Times - 1 views

  • As a psychiatrist, I’ve long wondered why some people get ill in the face of stress and adversity — either mentally or physically — while others rarely succumb.
  • not everyone gets PTSD after exposure to extreme trauma, while some people get disabling depression with minimal or no stress
  • What makes people resilient, and is it something they are born with or can it be acquired later in life?
  • New research suggests that one possible answer can be found in the brain’s so-called central executive network, which helps regulate emotions, thinking and behavior
  • used M.R.I. to study the brains of a racially diverse group of 218 people, ages 12 to 14, living in violent neighborhoods in Chicago
  • the youths who had higher levels of functional connectivity in the central executive network had better cardiac and metabolic health than their peers with lower levels of connectivity
  • when neighborhood homicide rates went up, the young people’s cardiometabolic risk — as measured by obesity, blood-pressure and insulin levels, among other variables — also increased, but only in youths who showed lower activity in this brain network
  • “Active resilience happens when people who are vulnerable find resources to cope with stress and bounce back, and do so in a way that leaves them stronger, ready to handle additional stress, in more adaptive ways.”
  • the more medically hardy young people were no less anxious or depressed than their less fortunate peers, which suggests that while being more resilient makes you less vulnerable to adversity, it doesn’t guarantee happiness — or even an awareness of being resilient.
  • there is good reason to believe the link may be causal because other studies have found that we can change the activity in the self-control network, and increase healthy behaviors, with simple behavioral interventions
  • One plausible explanation is that greater activity in this network increases self-control, which most likely reduces some unhealthy behaviors people often use to cope with stress, like eating junk food or smoking
  • In one study, two weeks of mindfulness training produced a 60 percent reduction in smoking, compared with no reduction in a control group that focused on relaxation. An M.R.I. following mindfulness training showed increased activity in the anterior cingulate cortex and prefrontal cortex, key brain areas in the executive self-control network.
  • Clearly self-control is one critical component of resilience that can be easily fostered. But there are others.
  • For example, mindfulness training, which involves attention control, emotion regulation and increased self-awareness, can increase connectivity within this network and help people to quit smoking.
  • she and colleagues studied the brains of depressed patients who died. They found that the most disrupted genes were those for growth factors, proteins that act like a kind of brain fertilizer.
  • “We came to realize that depressed people have lost their power to remodel their brains. And that is in fact devastating because brain remodeling is something we need to do all the time — we are constantly rewiring our brains based on past experience and the expectation of how we need to use them in the future,
  • one growth factor that is depleted in depressed brains, called fibroblast growth factor 2, also plays a role in resilience. When they gave it to stressed animals, they bounced back faster and acted less depressed. And when they gave it just once after birth to animals that had been bred for high levels of anxiety and inhibition, they were hardier for the rest of their lives.
  • The good news is that we have some control over our own brain levels of BDNF (brain-derived neurotrophic factor): Getting more physical exercise and social support, for example, has been shown to increase BDNF.
  • Perhaps someday we might be able to protect young people exposed to violence and adversity by supplementing them with neuroprotective growth factors. We know enough now to help them by fortifying their brains through exercise, mindfulness training and support systems
  • Some people have won the genetic sweepstakes and are naturally tough. But there is plenty the rest of us can do to be more resilient and healthier.
maxwellokolo

Research Seeks to Boost Memory and Performance with Targeted Electrical Stimulation - 0 views

  •  
    Neuroscience News has recent neuroscience research articles, brain research news, neurology studies and neuroscience resources for neuroscientists, students, and science fans and is always free to join. Our neuroscience social network has science groups, discussion forums, free books, resources, science videos and more.
maxwellokolo

Non-O Blood Groups Associated With Higher Risk of Heart Attack - 0 views

Javier E

They're Watching You at Work - Don Peck - The Atlantic - 2 views

  • Predictive statistical analysis, harnessed to big data, appears poised to alter the way millions of people are hired and assessed.
  • By one estimate, more than 98 percent of the world’s information is now stored digitally, and the volume of that data has quadrupled since 2007.
  • The application of predictive analytics to people’s careers—an emerging field sometimes called “people analytics”—is enormously challenging, not to mention ethically fraught
  • By the end of World War II, however, American corporations were facing severe talent shortages. Their senior executives were growing old, and a dearth of hiring from the Depression through the war had resulted in a shortfall of able, well-trained managers. Finding people who had the potential to rise quickly through the ranks became an overriding preoccupation of American businesses. They began to devise a formal hiring-and-management system based in part on new studies of human behavior, and in part on military techniques developed during both world wars, when huge mobilization efforts and mass casualties created the need to get the right people into the right roles as efficiently as possible. By the 1950s, it was not unusual for companies to spend days with young applicants for professional jobs, conducting a battery of tests, all with an eye toward corner-office potential.
  • But companies abandoned their hard-edged practices for another important reason: many of their methods of evaluation turned out not to be very scientific.
  • this regime, so widespread in corporate America at mid-century, had almost disappeared by 1990. “I think an HR person from the late 1970s would be stunned to see how casually companies hire now,”
  • Many factors explain the change, he said, and then he ticked off a number of them: Increased job-switching has made it less important and less economical for companies to test so thoroughly. A heightened focus on short-term financial results has led to deep cuts in corporate functions that bear fruit only in the long term. The Civil Rights Act of 1964, which exposed companies to legal liability for discriminatory hiring practices, has made HR departments wary of any broadly applied and clearly scored test that might later be shown to be systematically biased.
  • about a quarter of the country’s corporations were using similar tests to evaluate managers and junior executives, usually to assess whether they were ready for bigger roles.
  • He has encouraged the company’s HR executives to think about applying the games to the recruitment and evaluation of all professional workers.
  • Knack makes app-based video games, among them Dungeon Scrawl, a quest game requiring the player to navigate a maze and solve puzzles, and Wasabi Waiter, which involves delivering the right sushi to the right customer at an increasingly crowded happy hour. These games aren’t just for play: they’ve been designed by a team of neuroscientists, psychologists, and data scientists to suss out human potential. Play one of them for just 20 minutes, says Guy Halfteck, Knack’s founder, and you’ll generate several megabytes of data, exponentially more than what’s collected by the SAT or a personality test. How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality. The end result, Halfteck says, is a high-resolution portrait of your psyche and intellect, and an assessment of your potential as a leader or an innovator.
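The telemetry described in the excerpt above — logging every action and hesitation during play — can be sketched in a few lines. Everything here (class name, event format) is invented for illustration; it is not Knack's actual pipeline, only a toy model of the kind of data it collects.

```python
class GameTelemetry:
    """Toy sketch of game-play telemetry: log each player action with a
    timestamp so hesitation times and action sequences can be derived later.
    All names are illustrative, not Knack's real system."""

    def __init__(self):
        self.events = []  # list of (timestamp_seconds, action) pairs

    def log(self, action, timestamp):
        self.events.append((timestamp, action))

    def hesitations(self):
        """Seconds elapsed between each action and the one before it."""
        times = [t for t, _ in self.events]
        return [later - earlier for earlier, later in zip(times, times[1:])]


telemetry = GameTelemetry()
for t, action in [(0.0, "start"), (0.8, "move_left"),
                  (4.1, "open_door"), (4.5, "pick_key")]:
    telemetry.log(action, t)

# Long pauses before an action are one of the behavioral signals the
# article says get fed into the assessment.
print(telemetry.hesitations())
```

From a stream like this, a real system would derive many more features (action sequences, error recovery, prioritization) than the simple delays shown here.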
  • When the results came back, Haringa recalled, his heart began to beat a little faster. Without ever seeing the ideas, without meeting or interviewing the people who’d proposed them, without knowing their title or background or academic pedigree, Knack’s algorithm had identified the people whose ideas had panned out. The top 10 percent of the idea generators as predicted by Knack were in fact those who’d gone furthest in the process.
  • What Knack is doing, Haringa told me, “is almost like a paradigm shift.” It offers a way for his GameChanger unit to avoid wasting time on the 80 people out of 100—nearly all of whom look smart, well-trained, and plausible on paper—whose ideas just aren’t likely to work out.
  • Aptitude, skills, personal history, psychological stability, discretion, loyalty—companies at the time felt they had a need (and the right) to look into them all. That ambit is expanding once again, and this is undeniably unsettling. Should the ideas of scientists be dismissed because of the way they play a game? Should job candidates be ranked by what their Web habits say about them? Should the “data signature” of natural leaders play a role in promotion? These are all live questions today, and they prompt heavy concerns: that we will cede one of the most subtle and human of skills, the evaluation of the gifts and promise of other people, to machines; that the models will get it wrong; that some people will never get a shot in the new workforce.
  • scoring distance from work could violate equal-employment-opportunity standards. Marital status? Motherhood? Church membership? “Stuff like that,” Meyerle said, “we just don’t touch”—at least not in the U.S., where the legal environment is strict. Meyerle told me that Evolv has looked into these sorts of factors in its work for clients abroad, and that some of them produce “startling results.”
  • consider the alternative. A mountain of scholarly literature has shown that the intuitive way we now judge professional potential is rife with snap judgments and hidden biases, rooted in our upbringing or in deep neurological connections that doubtless served us well on the savanna but would seem to have less bearing on the world of work.
  • We may like to think that society has become more enlightened since those days, and in many ways it has, but our biases are mostly unconscious, and they can run surprisingly deep. Consider race. For a 2004 study called “Are Emily and Greg More Employable Than Lakisha and Jamal?,” the economists Sendhil Mullainathan and Marianne Bertrand put white-sounding names (Emily Walsh, Greg Baker) or black-sounding names (Lakisha Washington, Jamal Jones) on similar fictitious résumés, which they then sent out to a variety of companies in Boston and Chicago. To get the same number of callbacks, they learned, they needed to either send out half again as many résumés with black names as those with white names, or add eight extra years of relevant work experience to the résumés with black names.
  • a sociologist at Northwestern, spent parts of the three years from 2006 to 2008 interviewing professionals from elite investment banks, consultancies, and law firms about how they recruited, interviewed, and evaluated candidates, and concluded that among the most important factors driving their hiring recommendations were—wait for it—shared leisure interests.
  • Lacking “reliable predictors of future performance,” Rivera writes, “assessors purposefully used their own experiences as models of merit.” Former college athletes “typically prized participation in varsity sports above all other types of involvement.” People who’d majored in engineering gave engineers a leg up, believing they were better prepared.
  • the prevailing system of hiring and management in this country involves a level of dysfunction that should be inconceivable in an economy as sophisticated as ours. Recent survey data collected by the Corporate Executive Board, for example, indicate that nearly a quarter of all new hires leave their company within a year of their start date, and that hiring managers wish they’d never extended an offer to one out of every five members on their team
  • In the late 1990s, as these assessments shifted from paper to digital formats and proliferated, data scientists started doing massive tests of what makes for a successful customer-support technician or salesperson. This has unquestionably improved the quality of the workers at many firms.
  • In 2010, however, Xerox switched to an online evaluation that incorporates personality testing, cognitive-skill assessment, and multiple-choice questions about how the applicant would handle specific scenarios that he or she might encounter on the job. An algorithm behind the evaluation analyzes the responses, along with factual information gleaned from the candidate’s application, and spits out a color-coded rating: red (poor candidate), yellow (middling), or green (hire away). Those candidates who score best, I learned, tend to exhibit a creative but not overly inquisitive personality, and participate in at least one but not more than four social networks, among many other factors. (Previous experience, one of the few criteria that Xerox had explicitly screened for in the past, turns out to have no bearing on either productivity or retention.)
  • When Xerox started using the score in its hiring decisions, the quality of its hires immediately improved. The rate of attrition fell by 20 percent in the initial pilot period, and over time, the number of promotions rose. Xerox still interviews all candidates in person before deciding to hire them, Morse told me, but, she added, “We’re getting to the point where some of our hiring managers don’t even want to interview anymore”
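A color-coded scorer of the kind the excerpts above describe can be sketched roughly as follows. The feature names, weights, and thresholds are all invented for illustration (the real model is proprietary); the non-linear social-network rule echoes the "at least one but not more than four" finding mentioned above.

```python
def rate_candidate(features):
    """Toy sketch of a red/yellow/green hiring score.
    `features` maps attribute names to normalized 0-1 values.
    Weights and cutoffs are hypothetical, for illustration only."""
    weights = {
        "creativity": 0.4,
        "scenario_judgment": 0.4,
        "cognitive_skill": 0.2,
    }
    score = sum(weights[k] * features.get(k, 0.0) for k in weights)

    # Non-linear rule: moderate social-network participation scores best,
    # echoing the "at least one but not more than four" pattern above.
    networks = features.get("social_networks", 0)
    if not 1 <= networks <= 4:
        score *= 0.8

    if score >= 0.7:
        return "green"
    elif score >= 0.4:
        return "yellow"
    return "red"


print(rate_candidate({"creativity": 0.9, "scenario_judgment": 0.8,
                      "cognitive_skill": 0.7, "social_networks": 2}))
```

The point of the sketch is the shape of the system, not the numbers: a weighted combination of assessment features, plus a few non-linear rules, collapsed into a traffic-light rating a recruiter can act on without seeing the underlying model.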
  • Gone are the days, Ostberg told me, when, say, a small survey of college students would be used to predict the statistical validity of an evaluation tool. “We’ve got a data set of 347,000 actual employees who have gone through these different types of assessments or tools,” he told me, “and now we have performance-outcome data, and we can split those and slice and dice by industry and location.”
  • Evolv’s tests allow companies to capture data about everybody who applies for work, and everybody who gets hired—a complete data set from which sample bias, long a major vexation for industrial-organization psychologists, simply disappears. The sheer number of observations that this approach makes possible allows Evolv to say with precision which attributes matter more to the success of retail-sales workers (decisiveness, spatial orientation, persuasiveness) or customer-service personnel at call centers (rapport-building)
  • There are some data that Evolv simply won’t use, out of a concern that the information might lead to systematic bias against whole classes of people
  • the idea that hiring was a science fell out of favor. But now it’s coming back, thanks to new technologies and methods of analysis that are cheaper, faster, and much wider-ranging than what we had before
  • what most excites him are the possibilities that arise from monitoring the entire life cycle of a worker at any given company.
  • Now the two companies are working together to marry pre-hire assessments to an increasing array of post-hire data: about not only performance and duration of service but also who trained the employees; who has managed them; whether they were promoted to a supervisory role, and how quickly; how they performed in that role; and why they eventually left.
  • What begins with an online screening test for entry-level workers ends with the transformation of nearly every aspect of hiring, performance assessment, and management.
  • I turned to Sandy Pentland, the director of the Human Dynamics Laboratory at MIT. In recent years, Pentland has pioneered the use of specialized electronic “badges” that transmit data about employees’ interactions as they go about their days. The badges capture all sorts of information about formal and informal conversations: their length; the tone of voice and gestures of the people involved; how much those people talk, listen, and interrupt; the degree to which they demonstrate empathy and extroversion; and more. Each badge generates about 100 data points a minute.
  • he tried the badges out on about 2,500 people, in 21 different organizations, and learned a number of interesting lessons. About a third of team performance, he discovered, can usually be predicted merely by the number of face-to-face exchanges among team members. (Too many is as much of a problem as too few.) Using data gathered by the badges, he was able to predict which teams would win a business-plan contest, and which workers would (rightly) say they’d had a “productive” or “creative” day. Not only that, but he claimed that his researchers had discovered the “data signature” of natural leaders, whom he called “charismatic connectors” and all of whom, he reported, circulate actively, give their time democratically to others, engage in brief but energetic conversations, and listen at least as much as they talk.
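The finding above — that team performance tracks the number of face-to-face exchanges, with too many as harmful as too few — is an inverted-U relationship, which can be sketched with a toy model. The optimum and width parameters here are invented numbers, not Pentland's fitted values.

```python
def predicted_performance(exchanges, optimum=30.0, width=20.0):
    """Toy inverted-U model of the badge finding: predicted team
    performance peaks at a moderate number of face-to-face exchanges
    and falls off at either extreme. `optimum` and `width` are
    hypothetical parameters, for illustration only."""
    return max(0.0, 1.0 - ((exchanges - optimum) / width) ** 2)


# A moderate level of interaction scores higher than either extreme.
for n in (5, 30, 80):
    print(n, round(predicted_performance(n), 2))
```

A real analysis would fit such a curve to badge data rather than assume it, but the shape captures why "more communication" is not the prediction the badges support.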
  • His group is developing apps to allow team members to view their own metrics more or less in real time, so that they can see, relative to the benchmarks of highly successful employees, whether they’re getting out of their offices enough, or listening enough, or spending enough time with people outside their own team.
  • Torrents of data are routinely collected by American companies and now sit on corporate servers, or in the cloud, awaiting analysis. Bloomberg reportedly logs every keystroke of every employee, along with their comings and goings in the office. The Las Vegas casino Harrah’s tracks the smiles of the card dealers and waitstaff on the floor (its analytics team has quantified the impact of smiling on customer satisfaction). E‑mail, of course, presents an especially rich vein to be mined for insights about our productivity, our treatment of co-workers, our willingness to collaborate or lend a hand, our patterns of written language, and what those patterns reveal about our intelligence, social skills, and behavior.
  • people analytics will ultimately have a vastly larger impact on the economy than the algorithms that now trade on Wall Street or figure out which ads to show us. He reminded me that we’ve witnessed this kind of transformation before in the history of management science. Near the turn of the 20th century, both Frederick Taylor and Henry Ford famously paced the factory floor with stopwatches, to improve worker efficiency.
  • “The quantities of data that those earlier generations were working with,” he said, “were infinitesimal compared to what’s available now. There’s been a real sea change in the past five years, where the quantities have just grown so large—petabytes, exabytes, zetta—that you start to be able to do things you never could before.”
  • People analytics will unquestionably provide many workers with more options and more power. Gild, for example, helps companies find undervalued software programmers, working indirectly to raise those people’s pay. Other companies are doing similar work. One called Entelo, for instance, specializes in using algorithms to identify potentially unhappy programmers who might be receptive to a phone call.
  • He sees it not only as a boon to a business’s productivity and overall health but also as an important new tool that individual employees can use for self-improvement: a sort of radically expanded The 7 Habits of Highly Effective People, custom-written for each of us, or at least each type of job, in the workforce.
  • the most exotic development in people analytics today is the creation of algorithms to assess the potential of all workers, across all companies, all the time.
  • The way Gild arrives at these scores is not simple. The company’s algorithms begin by scouring the Web for any and all open-source code, and for the coders who wrote it. They evaluate the code for its simplicity, elegance, documentation, and several other factors, including the frequency with which it’s been adopted by other programmers. For code that was written for paid projects, they look at completion times and other measures of productivity. Then they look at questions and answers on social forums such as Stack Overflow, a popular destination for programmers seeking advice on challenging projects. They consider how popular a given coder’s advice is, and how widely that advice ranges.
  • The algorithms go further still. They assess the way coders use language on social networks from LinkedIn to Twitter; the company has determined that certain phrases and words used in association with one another can distinguish expert programmers from less skilled ones. Gild knows these phrases and words are associated with good coding because it can correlate them with its evaluation of open-source code, and with the language and online behavior of programmers in good positions at prestigious companies.
  • having made those correlations, Gild can then score programmers who haven’t written open-source code at all, by analyzing the host of clues embedded in their online histories. They’re not all obvious, or easy to explain. Vivienne Ming, Gild’s chief scientist, told me that one solid predictor of strong coding is an affinity for a particular Japanese manga site.
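The scoring described across the three excerpts above — code quality, adoption by others, forum reputation, and language signals combined into one number — can be sketched as a simple weighted composite. The signal names and weights below are hypothetical; Gild's actual model and features are proprietary.

```python
def coder_score(signals):
    """Toy composite coder score from public signals, in the spirit of
    the scoring described above. `signals` maps signal names to
    normalized 0-1 values. Names and weights are hypothetical."""
    weights = {
        "code_simplicity": 0.3,   # static analysis of open-source code
        "adoption": 0.3,          # how often others reuse the code
        "qa_reputation": 0.2,     # e.g. well-received forum answers
        "language_signals": 0.2,  # phrasing correlated with strong coders
    }
    total = sum(weights[k] * signals.get(k, 0.0) for k in weights)
    return round(100 * total)  # 0-100 score


print(coder_score({"code_simplicity": 0.9, "adoption": 0.6,
                   "qa_reputation": 0.8, "language_signals": 0.5}))
```

The interesting step the excerpts describe is not the weighted sum itself but the correlation work behind it: once language signals are calibrated against coders whose open-source work can be scored directly, the same score can be estimated for people who have published no code at all.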
  • Gild’s CEO, Sheeroy Desai, told me he believes his company’s approach can be applied to any occupation characterized by large, active online communities, where people post and cite individual work, ask and answer professional questions, and get feedback on projects. Graphic design is one field that the company is now looking at, and many scientific, technical, and engineering roles might also fit the bill. Regardless of their occupation, most people leave “data exhaust” in their wake, a kind of digital aura that can reveal a lot about a potential hire.
  • professionally relevant personality traits can be judged effectively merely by scanning Facebook feeds and photos. LinkedIn, of course, captures an enormous amount of professional data and network information, across just about every profession. A controversial start-up called Klout has made its mission the measurement and public scoring of people’s online social influence.
  • Mullainathan expressed amazement at how little most creative and professional workers (himself included) know about what makes them effective or ineffective in the office. Most of us can’t even say with any certainty how long we’ve spent gathering information for a given project, or our pattern of information-gathering, never mind know which parts of the pattern should be reinforced, and which jettisoned. As Mullainathan put it, we don’t know our own “production function.”
  • Over time, better job-matching technologies are likely to begin serving people directly, helping them see more clearly which jobs might suit them and which companies could use their skills. In the future, Gild plans to let programmers see their own profiles and take skills challenges to try to improve their scores. It intends to show them its estimates of their market value, too, and to recommend coursework that might allow them to raise their scores even more. Not least, it plans to make accessible the scores of typical hires at specific companies, so that software engineers can better see the profile they’d need to land a particular job
  • Knack, for its part, is making some of its video games available to anyone with a smartphone, so people can get a better sense of their strengths, and of the fields in which their strengths would be most valued. (Palo Alto High School recently adopted the games to help students assess careers.) Ultimately, the company hopes to act as matchmaker between a large network of people who play its games (or have ever played its games) and a widening roster of corporate clients, each with its own specific profile for any given type of job.
  • When I began my reporting for this story, I was worried that people analytics, if it worked at all, would only widen the divergent arcs of our professional lives, further gilding the path of the meritocratic elite from cradle to grave, and shutting out some workers more definitively. But I now believe the opposite is likely to happen, and that we’re headed toward a labor market that’s fairer to people at every stage of their careers
  • For decades, as we’ve assessed people’s potential in the professional workforce, the most important piece of data—the one that launches careers or keeps them grounded—has been educational background: typically, whether and where people went to college, and how they did there. Over the past couple of generations, colleges and universities have become the gatekeepers to a prosperous life. A degree has become a signal of intelligence and conscientiousness, one that grows stronger the more selective the school and the higher a student’s GPA, that is easily understood by employers, and that, until the advent of people analytics, was probably unrivaled in its predictive powers.
  • the limitations of that signal—the way it degrades with age, its overall imprecision, its many inherent biases, its extraordinary cost—are obvious. “Academic environments are artificial environments,” Laszlo Bock, Google’s senior vice president of people operations, told The New York Times in June. “People who succeed there are sort of finely trained, they’re conditioned to succeed in that environment,” which is often quite different from the workplace.
  • because one’s college history is such a crucial signal in our labor market, perfectly able people who simply couldn’t sit still in a classroom at the age of 16, or who didn’t have their act together at 18, or who chose not to go to graduate school at 22, routinely get left behind for good. That such early factors so profoundly affect career arcs and hiring decisions made two or three decades later is, on its face, absurd.
  • I spoke with managers at a lot of companies who are using advanced analytics to reevaluate and reshape their hiring, and nearly all of them told me that their research is leading them toward pools of candidates who didn’t attend college—for tech jobs, for high-end sales positions, for some managerial roles. In some limited cases, this is because their analytics revealed no benefit whatsoever to hiring people with college degrees; in other cases, and more often, it’s because they revealed signals that function far better than college history,
  • Google, too, is hiring a growing number of nongraduates. Many of the people I talked with reported that when it comes to high-paying and fast-track jobs, they’re reducing their preference for Ivy Leaguers and graduates of other highly selective schools.
  • This process is just beginning. Online courses are proliferating, and so are online markets that involve crowd-sourcing. Both arenas offer new opportunities for workers to build skills and showcase competence. Neither produces the kind of instantly recognizable signals of potential that a degree from a selective college, or a first job at a prestigious firm, might. That’s a problem for traditional hiring managers, because sifting through lots of small signals is so difficult and time-consuming.
  • all of these new developments raise philosophical questions. As professional performance becomes easier to measure and see, will we become slaves to our own status and potential, ever-focused on the metrics that tell us how and whether we are measuring up? Will too much knowledge about our limitations hinder achievement and stifle our dreams? All I can offer in response to these questions, ironically, is my own gut sense, which leads me to feel cautiously optimistic.
  • Google’s understanding of the promise of analytics is probably better than anybody else’s, and the company has been changing its hiring and management practices as a result of its ongoing analyses. (Brainteasers are no longer used in interviews, because they do not correlate with job success; GPA is not considered for anyone more than two years out of school, for the same reason—the list goes on.) But for all of Google’s technological enthusiasm, these same practices are still deeply human. A real, live person looks at every résumé the company receives. Hiring decisions are made by committee and are based in no small part on opinions formed during structured interviews.
maxwellokolo

Why Playing a Musical Instrument Can Protect Brain Health - 0 views

  •  
    Neuroscience News has recent neuroscience research articles, brain research news, neurology studies and neuroscience resources for neuroscientists, students, and science fans and is always free to join. Our neuroscience social network has science groups, discussion forums, free books, resources, science videos and more.
maxwellokolo

New 'GPS' Neurons Discovered - 0 views

maxwellokolo

Puberty Hormones Trigger Changes in Youthful Learning - 0 views

maxwellokolo

Researchers Rewire Brain of One Species to Have Connectivity of Another - 0 views

maxwellokolo

Cracking the Brain's Memory Codes - 0 views

maxwellokolo

A Non-Invasive Method for Deep Brain Stimulation - 0 views

maxwellokolo

Social Laughter Releases Endorphins in the Brain - 0 views

Javier E

The Epidemic of Facelessness - NYTimes.com - 1 views

  • The fact that the case ended up in court is rare; the viciousness it represents is not. Everyone in the digital space is, at one point or another, exposed to online monstrosity, one of the consequences of the uniquely contemporary condition of facelessness.
  • There is a vast dissonance between virtual communication and an actual police officer at the door. It is a dissonance we are all running up against more and more, the dissonance between the world of faces and the world without faces. And the world without faces is coming to dominate.
  • Inability to see a face is, in the most direct way, inability to recognize shared humanity with another. In a metastudy of antisocial populations, the inability to sense the emotions on other people’s faces was a key correlation. There is “a consistent, robust link between antisocial behavior and impaired recognition of fearful facial affect. Relative to comparison groups, antisocial populations showed significant impairments in recognizing fearful, sad and surprised expressions.”
  • the faceless communication social media creates, the linked distances between people, both provokes and mitigates the inherent capacity for monstrosity.
  • The Gyges effect, the well-noted disinhibition created by communications over the distances of the Internet, in which all speech and image are muted and at arm’s reach, produces an inevitable reaction — the desire for impact at any cost, the desire to reach through the screen, to make somebody feel something, anything. A simple comment can so easily be ignored. Rape threat? Not so much. Or, as Mr. Nunn so succinctly put it on Twitter: “If you can’t threaten to rape a celebrity, what is the point in having them?”
  • The challenge of our moment is that the face has been at the root of justice and ethics for 2,000 years.
  • The precondition of any trial, of any attempt to reconcile competing claims, is that the victim and the accused look each other in the face.
  • For the great French-Jewish philosopher Emmanuel Levinas, the encounter with another’s face was the origin of identity — the reality of the other preceding the formation of the self. The face is the substance, not just the reflection, of the infinity of another person. And from the infinity of the face comes the sense of inevitable obligation, the possibility of discourse, the origin of the ethical impulse.
  • “Through imitation and mimicry, we are able to feel what other people feel. By being able to feel what other people feel, we are also able to respond compassionately to other people’s emotional states.” The face is the key to the sense of intersubjectivity, linking mimicry and empathy through mirror neurons — the brain mechanism that creates imitation even in nonhuman primates.
  • it’s also no mere technical error on the part of Twitter; faceless rage is inherent to its technology.
  • Without a face, the self can form only with the rejection of all otherness, with a generalized, all-purpose contempt — a contempt that is so vacuous because it is so vague, and so ferocious because it is so vacuous. A world stripped of faces is a world stripped, not merely of ethics, but of the biological and cultural foundations of ethics.
  • The spirit of facelessness is coming to define the 21st. Facelessness is not a trend; it is a social phase we are entering that we have not yet figured out how to navigate.
  • the flight back to the face takes on new urgency. Google recently reported that on Android alone, which has more than a billion active users, people take 93 million selfies a day
  • Emojis are an explicit attempt to replicate the emotional context that facial expression provides. Intriguingly, emojis express emotion, often negative emotions, but you cannot troll with them.
  • But all these attempts to provide a digital face run counter to the main current of our era’s essential facelessness. The volume of digital threats appears to be too large for police forces to adequately deal with.
  • The more established wisdom about trolls, at this point, is to disengage. Obviously, in many cases, actual crimes are being committed, crimes that demand confrontation, by victims and by law enforcement officials, but in everyday digital life engaging with the trolls “is like trying to drown a vampire with your own blood,”
  • There is a third way, distinct from confrontation or avoidance: compassion
  • we need a new art of conversation for the new conversations we are having — and the first rule of that art must be to remember that we are talking to human beings: “Never say anything online that you wouldn’t say to somebody’s face.” But also: “Don’t listen to what people wouldn’t say to your face.”
  • The neurological research demonstrates that empathy, far from being an artificial construct of civilization, is integral to our biology.