TOK Friends: Group items tagged "iq"

lenaurick

IQ can predict your risk of death, and 8 other smart facts about intelligence - Vox - 0 views

  • But according to Stuart Ritchie, an intelligence researcher at the University of Edinburgh, there's a massive amount of data showing that it's one of the best predictors of someone's longevity, health, and prosperity
  • In a new book, Intelligence: All that Matters, Ritchie persuasively argues that IQ doesn't necessarily set the limit for what we can do, but it does give us a starting point
  • Most people you meet are probably average, and a few are extraordinarily smart. Just 2.2 percent have an IQ of 130 or greater.
  • ...17 more annotations...
  • "The classic finding — I would say it is the most replicated finding in psychology — is that people who are good at one type of mental task tend to be good at them all,"
  • The g factor is real in the sense that it can predict outcomes in our lives — how much money you'll make, how productive a worker you might be, and, most chillingly, how likely you are to die an earlier death.
  • According to the research, people with high IQs tend to be healthier and live longer than the rest of us
  • One is the fact that people with higher IQs tend to make more money than people with lower scores. Money is helpful in maintaining weight, nutrition, and accessing good health care.
  • IQ often beats personality when it comes to predicting life outcomes: Personality traits, a recent study found, can explain about 4 percent of the variance in test scores for students under age 16. IQ can explain 25 percent, or an even higher proportion, depending on the study.
  • Many of these correlations are less than .5, which means there's plenty of room for individual differences. So, yes, very smart people who are awful at their jobs exist. You're just less likely to come across them.
  • "The correlation between IQ and happiness is usually positive, but also usually smaller than one might expect (and sometimes not statistically significant)," Ritchie says.
  • It could also be that people with higher IQs are smart enough to avoid accidents and mishaps. There's actually some evidence to support this: Higher-IQ people are less likely to die in traffic accidents.
  • Even though intelligence generally declines with age, those who had high IQs as children were most likely to retain their smarts as very old people.
  • "If we know the genes related to intelligence — and we know these genes are related to cognitive decline as well — then we can start to a predict who is going to have the worst cognitive decline, and devote health care medical resources to them," he says.
  • Studies comparing identical and fraternal twins find about half of IQ can be explained by genetics.
  • Genetics seems to become more predictive of IQ with age.
  • The idea is as we age, we grow more in control of our environments. Those environments we create can then "amplify" the potential of our genes.
  • About half the variability in IQ is attributed to the environment. Access to nutrition, education, and health care appear to play a big role.
  • "People’s lives are really messy, and the environments they are in are messy. There’s a possibility that a lot of the environmental effect on a person’s intelligence is random."
  • Hurray! Mean IQ scores appear to be increasing by 2 to 3 points per decade.
  • This phenomenon is known as the Flynn effect, and it is likely the result of increasing quality of childhood nutrition, health care, and education.
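Two of the numbers quoted above fall straight out of basic statistics: IQ tests are normed to a mean of 100 with a standard deviation of 15, so the share of people above 130 is just a normal-distribution tail, and a correlation of r explains r² of the variance (which is how a 0.5 correlation corresponds to the "25 percent" figure). A quick check, using only Python's standard library:

```python
import math

# IQ scores are standardized to mean 100, standard deviation 15.
MEAN, SD = 100, 15

def share_above(score):
    """Fraction of a normal population scoring at or above `score`."""
    z = (score - MEAN) / SD
    # Survival function of the standard normal, via the complementary
    # error function: P(Z >= z) = erfc(z / sqrt(2)) / 2.
    return 0.5 * math.erfc(z / math.sqrt(2))

# The exact tail is about 2.3 percent, consistent with the article's
# rounded "2.2 percent" for an IQ of 130 or greater.
print(f"P(IQ >= 130) = {share_above(130):.3f}")  # 0.023

# A correlation of r explains r^2 of the variance in an outcome:
# r = 0.5 between IQ and an outcome explains 25% of its variance.
r = 0.5
print(f"variance explained at r = {r}: {r**2:.0%}")  # 25%
```

This also makes concrete why correlations "less than .5" leave plenty of room for individual differences: even at r = 0.5, three quarters of the variance comes from everything else.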
kushnerha

BBC - Future - The surprising downsides of being clever - 0 views

  • If ignorance is bliss, does a high IQ equal misery? Popular opinion would have it so. We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness. Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest. As Ernest Hemingway wrote: “Happiness in intelligent people is the rarest thing I know.”
  • Combing California’s schools for the crème de la crème, he selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the “Termites”, and the highs and lows of their lives are still being studied to this day.
  • The Termites’ average salary was twice that of the average white-collar job. But not all the group met Terman’s expectations – there were many who pursued more “humble” professions such as police officers, seafarers, and typists. For this reason, Terman concluded that “intellect and achievement are far from perfectly correlated”. Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average.
  • ...16 more annotations...
  • One possibility is that knowledge of your talents becomes something of a ball and chain. Indeed, during the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations.
  • The most notable, and sad, case concerns the maths prodigy Sufiah Yusof. Enrolled at Oxford University aged 12, she dropped out of her course before taking her finals and started waitressing. She later worked as a call girl, entertaining clients with her ability to recite equations during sexual acts.
  • Another common complaint, often heard in student bars and internet forums, is that smarter people somehow have a clearer vision of the world’s failings. Whereas the rest of us are blinkered from existential angst, smarter people lie awake agonising over the human condition or other people’s folly.
  • Researchers at MacEwan University in Canada found that those with higher IQs did indeed feel more anxiety throughout the day. Interestingly, though, most worries were mundane, day-to-day concerns; the high-IQ students were far more likely to be replaying an awkward conversation than asking the “big questions”. “It’s not that their worries were more profound, but they are just worrying more often about more things,” says Penney. “If something negative happened, they thought about it more.”
  • This worrying seemed to correlate with verbal intelligence – the kind tested by word games in IQ tests – as opposed to prowess at spatial puzzles (which, in fact, seemed to reduce the risk of anxiety). He speculates that greater eloquence might also make you more likely to verbalise anxieties and ruminate over them. It’s not necessarily a disadvantage, though. “Maybe they were problem-solving a bit more than most people,” he says – which might help them to learn from their mistakes.
  • The harsh truth, however, is that greater intelligence does not equate to wiser decisions; in fact, in some cases it might make your choices a little more foolish.
  • we need to turn our minds to an age-old concept: “wisdom”. His approach is more scientific than it might at first sound. “The concept of wisdom has an ethereal quality to it,” he admits. “But if you look at the lay definition of wisdom, many people would agree it’s the idea of someone who can make good unbiased judgement.”
  • “my-side bias” – our tendency to be highly selective in the information we collect so that it reinforces our previous attitudes. The more enlightened approach would be to leave your assumptions at the door as you build your argument – but Stanovich found that smarter people are almost no more likely to do so than people with distinctly average IQs.
  • People who ace standard cognitive tests are in fact slightly more likely to have a “bias blind spot”. That is, they are less able to see their own flaws, even though they are quite capable of criticising the foibles of others. And they have a greater tendency to fall for the “gambler’s fallacy”
  • A tendency to rely on gut instincts rather than rational thought might also explain why a surprisingly high number of Mensa members believe in the paranormal; or why someone with an IQ of 140 is about twice as likely to max out their credit card.
  • “The people pushing the anti-vaccination meme on parents and spreading misinformation on websites are generally of more than average intelligence and education.” Clearly, clever people can be dangerously, and foolishly, misguided.
  • spent the last decade building tests for rationality, and he has found that fair, unbiased decision-making is largely independent of IQ.
  • Crucially, Grossmann found that IQ was not related to any of these measures, and certainly didn’t predict greater wisdom. “People who are very sharp may generate, very quickly, arguments [for] why their claims are the correct ones – but may do it in a very biased fashion.”
  • employers may well begin testing these abilities in place of IQ; Google has already announced that it plans to screen candidates for qualities like intellectual humility, rather than sheer cognitive prowess.
  • He points out that we often find it easier to leave our biases behind when we consider other people, rather than ourselves. Along these lines, he has found that simply talking through your problems in the third person (“he” or “she”, rather than “I”) helps create the necessary emotional distance, reducing your prejudices and leading to wiser arguments.
  • If you’ve been able to rest on the laurels of your intelligence all your life, it could be very hard to accept that it has been blinding your judgement. As Socrates had it: the wisest person really may be the one who can admit he knows nothing.
oliviaodon

Why Are Some People So Smart? The Answer Could Spawn a Generation of Superbabies | WIRED - 0 views

  • use those machines to examine the genetic underpinnings of genius like his own. He wants nothing less than to crack the code for intelligence by studying the genomes of thousands of prodigies, not just from China but around the world.
  • fully expect they will succeed in identifying a genetic basis for IQ. They also expect that within a decade their research will be used to screen embryos during in vitro fertilization, boosting the IQ of unborn children by up to 20 points. In theory, that’s the difference between a kid who struggles through high school and one who sails into college.
  • studies make it clear that IQ is strongly correlated with the ability to solve all sorts of abstract problems, whether they involve language, math, or visual patterns. The frightening upshot is that IQ remains by far the most powerful predictor of the life outcomes that people care most about in the modern world. Tell me your IQ and I can make a decently accurate prediction of your occupational attainment, how many kids you’ll have, your chances of being arrested for a crime, even how long you’ll live.
  • ...6 more annotations...
  • Dozens of popular books by nonexperts have filled the void, many claiming that IQ—which after more than a century remains the dominant metric for intelligence—predicts nothing important or that intelligence is simply too complex and subtle to be measured.
  • evidence points toward a strong genetic component in IQ. Based on studies of twins, siblings, and adoption, contemporary estimates put the heritability of IQ at 50 to 80 percent
  • intelligence has a genetic recipe
  • “Do you know any Perl?” Li asked him. Perl is a programming language often used to analyze genomic data. Zhao admitted he did not; in fact, he had no programming skills at all. Li handed him a massive textbook, Programming Perl. There were only two weeks left in the camp, so this would get rid of the kid for good. A few days later, Zhao returned. “I finished it,” he said. “The problems are kind of boring. Do you have anything harder?” Perl is a famously complicated language that takes university students a full year to learn.
  • So Li gave him a large DNA data set and a complicated statistical problem. That should do it. But Zhao returned later that day. “Finished.” Not only was it finished—and correct—but Zhao had even built a slick interface on top of the data.
  • driven by a fascination with kids who are born smart; he wants to know what makes them—and by extension, himself—the way they are.
  •  
    This is a really interesting article about using science to improve intelligence.
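The "50 to 80 percent" heritability estimates quoted above come from comparing identical (monozygotic) twins, who share essentially all their genes, with fraternal (dizygotic) twins, who share about half. The classic back-of-the-envelope version of this comparison is Falconer's formula. A minimal sketch, with illustrative correlation values that are assumptions for the example, not figures from the article:

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's formula: estimate heritability from twin correlations.

    MZ twins share ~100% of their genes and DZ twins ~50%, so doubling
    the gap between the two correlations isolates the genetic share of
    the variance (under the formula's simplifying assumptions).
    """
    return 2 * (r_mz - r_dz)

# Illustrative values (not from the article): an MZ correlation of 0.85
# and a DZ correlation of 0.60 imply heritability of about 0.5, in line
# with the "about half of IQ" twin-study estimate quoted earlier.
print(round(falconer_heritability(0.85, 0.60), 2))  # 0.5
```

Nudging the input correlations within plausible ranges is one way to see how the same method yields the 50-to-80-percent spread of published estimates.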
Emily Freilich

Are You Smarter Than Your Grandfather? Probably Not. | Science | Smithsonian - 1 views

  • IQ test scores had significantly risen from one generation to the next
  • widespread increase in IQ scores, and reveals some new ones, regarding teenagers’ vocabularies and the mental decline of the extremely bright in old age. Ultimately, Flynn concludes that human beings are not smarter—just more modern
  • there is a subtest called “similarities,” which asks questions like, what do dogs and rabbits have in common? Or what do truth and beauty have in common? On this subtest, the gains over those 50 years have been quite extraordinary, something like 25 points. The arithmetic subtest essentially tests arithmetical reasoning, and on that, the gains have been extremely small.
  • ...9 more annotations...
  • in 1900 in America, if you asked a child, what do dogs and rabbits have in common, they would say, “Well, you use dogs to hunt rabbits.” This is not the answer that the IQ tests want. They want you to classify. Today, a child would be likely to say, “They are both animals.” They picked up the habit of classification and use the vocabulary of science.
  • In 1910, schools were focused on kids memorizing things about the real world. Today, they are entirely about relationships.
  • One of the fundamental things is the switch from “utilitarian spectacles” to “scientific spectacles.” The fact that we wear scientific spectacles doesn’t mean that we actually know a lot about science.
  • Formal schooling is terribly important; it helps you think in the way that IQ testers like.
  • we have learned to use logic to attack the hypothetical. We have an ability to deal with a much wider range of problems than our ancestors would.
  • In 1950, teenagers could not only understand their parents, but they could also mimic their speech. Today, teenagers can still understand their parents. Their passive vocabularies are good enough. But when it comes to the words they actively use, they are much less capable of adult speak.
  • The brighter you are, the quicker after the age of 65 you have a downward curve for your analytic abilities
  • Retire from your job, but read great literature. Read about the history of science. Try and keep up your problem solving skills
  • One of the most interesting predictions is what will happen to the developing world. If they industrialize, in theory, they should have the explosive IQ gains in the coming century that we had in the last century.
Javier E

Denying Genetics Isn't Shutting Down Racism, It's Fueling It - 0 views

  • For many on the academic and journalistic left, genetics are deemed largely irrelevant when it comes to humans. Our large brains and the societies we have constructed with them, many argue, swamp almost all genetic influences.
  • Humans, in this view, are the only species on Earth largely unaffected by recent (or ancient) evolution, the only species where, for example, the natural division of labor between male and female has no salience at all, the only species, in fact, where natural variations are almost entirely social constructions, subject to reinvention.
  • if we assume genetics play no role, and base our policy prescriptions on something untrue, we are likely to overshoot and over-promise in social policy, and see our rhetoric on race become ever more extreme and divisive.
  • ...21 more annotations...
  • Reich simply points out that this utopian fiction is in danger of collapse because it is not true and because genetic research is increasingly proving it untrue.
  • “You will sometimes hear that any biological differences among populations are likely to be small, because humans have diverged too recently from common ancestors for substantial differences to have arisen under the pressure of natural selection. This is not true. The ancestors of East Asians, Europeans, West Africans and Australians were, until recently, almost completely isolated from one another for 40,000 years or longer, which is more than sufficient time for the forces of evolution to work.” Which means to say that the differences could be (and actually are) substantial.
  • If you don’t establish a reasonable forum for debate on this, Reich argues, if you don’t establish the principle that we do not have to be afraid of any of this, it will be monopolized by truly unreasonable and indeed dangerous racists. And those racists will have the added prestige for their followers of revealing forbidden knowledge.
  • so there are two arguments against the suppression of this truth and the stigmatization of its defenders: that it’s intellectually dishonest and politically counterproductive.
  • Klein seems to back a truly extreme position: that only the environment affects IQ scores, and genes play no part in group differences in human intelligence. To this end, he cites the “Flynn effect,” which does indeed show that IQ levels have increased over the years, and are environmentally malleable up to a point. In other words, culture, politics, and economics do matter.
  • But Klein does not address the crucial point that even with increases in IQ across all races over time, the racial gap is still frustratingly persistent, that, past a certain level, IQ measurements have actually begun to fall in many developed nations, and that Flynn himself acknowledges that the effect does not account for other genetic influences on intelligence.
  • In an email exchange with me, in which I sought clarification, Klein stopped short of denying genetic influences altogether, but argued that, given rising levels of IQ, and given how brutal the history of racism against African-Americans has been, we should nonetheless assume “right now” that genes are irrelevant.
  • My own brilliant conclusion: Group differences in IQ are indeed explicable through both environmental and genetic factors and we don’t yet know quite what the balance is.
  • We are, in this worldview, alone on the planet, born as blank slates, to be written on solely by culture. All differences between men and women are a function of this social effect; as are all differences between the races. If, in the aggregate, any differences in outcome between groups emerge, it is entirely because of oppression, patriarchy, white supremacy, etc. And it is a matter of great urgency that we use whatever power we have to combat these inequalities.
  • Liberalism has never promised equality of outcomes, merely equality of rights. It’s a procedural political philosophy rooted in means, not a substantive one justified by achieving certain ends.
  • A more nuanced understanding of race, genetics, and environment would temper this polarization, and allow for more unifying, practical efforts to improve equality of opportunity, while never guaranteeing or expecting equality of outcomes.
  • In some ways, this is just a replay of the broader liberal-conservative argument. Leftists tend to believe that all inequality is created; liberals tend to believe we can constantly improve the world in every generation, forever perfecting our societies.
  • Rightists believe that human nature is utterly unchanging; conservatives tend to see the world as less plastic than liberals, and attempts to remake it wholesale dangerous and often counterproductive.
  • I think the genius of the West lies in having all these strands in our politics competing with one another.
  • Where I do draw the line is the attempt to smear legitimate conservative ideas and serious scientific arguments as the equivalent of peddling white supremacy and bigotry. And Klein actively contributes to that stigmatization and demonization. He calls the science of this “race science” as if it were some kind of illicit and illegitimate activity, rather than simply “science.”
  • He goes on to equate the work of these scientists with the “most ancient justification for bigotry and racial inequality.” He even uses racism to dismiss Murray and Harris: they are, after all, “two white men.”
  • He still refuses to believe that Murray’s views on this are perfectly within the academic mainstream in studies of intelligence, as they were in 1994.
  • Klein cannot seem to hold the following two thoughts in his brain at the same time: that past racism and sexism are foul, disgusting, and have wrought enormous damage and pain and that unavoidable natural differences between races and genders can still exist.
  • It matters that we establish a liberalism that is immune to such genetic revelations, that can strive for equality of opportunity, and can affirm the moral and civic equality of every human being on the planet.
  • We may even embrace racial discrimination, as in affirmative action, that fuels deeper divides. All of which, it seems to me, is happening — and actively hampering racial progress, as the left defines the most multiracial and multicultural society in human history as simply “white supremacy” unchanged since slavery; and as the right viscerally responds by embracing increasingly racist white identity politics.
  • liberalism is integral to our future as a free society — and it should not falsely be made contingent on something that can be empirically disproven. It must allow for the truth of genetics to be embraced, while drawing the firmest of lines against any moral or political abuse of it
Duncan H

What to Do About 'Coming Apart' - NYTimes.com - 0 views

  • Murray has produced a book-length argument placing responsibility for rising inequality and declining mobility on widespread decay in the moral fiber of white, lower-status, less well-educated Americans, putting relatively less emphasis on a similar social breakdown among low-status, less-educated Americans of all races
  • Murray’s strength lies in his ability to raise issues that center-left policy makers and academics prefer, for the most part, to shy away from. His research methods, his statistical analyses and the conclusions he draws are subject to passionate debate. But by forcing taboo issues into the public arena, Murray has opened up for discussion politically salient issues that lurk at a subterranean level in the back-brains of many voters, issues that are rarely examined with the rigor necessary to affirm or deny their legitimacy.
  • The National Review and the Conservative Monitor cited “Losing Ground” as one of the ten books that most changed America. Murray’s book seemed like a bolt of lightning in the middle of the night revealing what should have been plain as the light of day. The welfare state so carefully built up in the 1960s and 1970s created a system of disincentives for people to better their own lives. By paying welfare mothers to have children out of wedlock, the system encouraged more of these births into poor homes. By doling out dollars at a rate that could not be matched by the economy, the system encouraged the poor to stay home.
  • ...9 more annotations...
  • He contends in “Coming Apart” that there was far greater social cohesion across class lines 50 years ago because “the powerful norms of social and economic behavior in 1960 swept virtually everyone into their embrace,” adding in a Jan. 21 op-ed in the Wall Street Journal that “Over the past 50 years, that common civic culture has unraveled. We have developed a new upper class with advanced educations, often obtained at elite schools, sharing tastes and preferences that set them apart from mainstream America. At the same time, we have developed a new lower class, characterized not by poverty but by withdrawal from America’s core cultural institutions.” According to Murray, higher education has now become a proxy for higher IQ, as elite colleges become sorting mechanisms for finding, training and introducing to each other the most intellectually gifted young people. Fifty years into the education revolution, members of this elite are likely to be themselves the offspring of cognitively gifted parents, and to ultimately bear cognitively gifted children.
  • “Industriousness: The norms for work and women were revolutionized after 1960, but the norm for men putatively has remained the same: Healthy men are supposed to work. In practice, though, that norm has eroded everywhere.”
  • Murray makes the case that cognitive ability is worth ever more in modern advanced, technologically complex hypercompetitive market economies. As an example, Murray quotes Bill Gates: “Software is an IQ business. Microsoft must win the IQ war or we won’t have a future.”
  • Murray alleges that those with higher IQs now exhibit personal and social behavioral choices in areas like marriage, industriousness, honesty and religiosity that allow them to enjoy secure and privileged lives. Whites in the lower social-economic strata are less cognitively able – in Murray’s view – and thus less well-equipped to resist the lure of the sexual revolution and doctrines of self-actualization so they succumb to higher rates of family dissolution, non-marital births, worklessness and criminality. This interaction between IQ and behavioral choice, in Murray’s framework, is what has led to the widening income and cultural gap.
  • Despised by the left, Murray has arguably done liberals a service by requiring them to deal with those whose values may seem alien, to examine the unintended consequences of their policies and to grapple with the political impact of assertions made by the right. He has also amassed substantial evidence to bolster his claims and at the same time elicited a formidable academic counter-attack.
  • To Murray, the overarching problem is that liberal elites, while themselves living lives of probity, have refused to proselytize for the bourgeois virtues to which they subscribe, thus leaving their less discerning fellow-citizens to flounder in the anti-bourgeois legacy of the counter-cultural 1960s.
  • “Great Civic Awakening” among the new upper class – an awakening that will lead to the kind of “moral rearmament” and paternalism characteristic of anti-poverty drives in the 19th century. To achieve this, Murray believes, the “new upper class must once again fall in love with what makes America different.”
  • The cognitive elites Murray cites are deeply committed to liberal norms of cultural tolerance and permissiveness. The antipathy to the moralism of the religious right has, in fact, been a major force driving these upscale, secular voters into the Democratic party.
  • changes in the world economy may be destructive in terms of the old social model, but they are profoundly liberating and benign in and of themselves. The family farm wasn’t dying because capitalism had failed or a Malthusian crisis was driving the world to starvation. The family farm died of abundance; it died of the rapidly rising productivity that meant that fewer and fewer people had to work to produce the food on which humanity depended. Mead continues: “Revolutions in manufacturing and, above all, in communications and information technology create the potential for unprecedented abundance and a further liberation of humanity from meaningless and repetitive work. Our problem isn’t that the sources of prosperity have dried up in a long drought; our problem is that we don’t know how to swim. It is raining soup, and we are stuck holding a fork.” The 21st century, Mead adds, “must reinvent the American Dream. It must recast our economic, social, familial, educational and political systems for new challenges and new opportunities. Some hallowed practices and institutions will have to go under the bus. But in the end, the changes will make us richer, more free and more secure than we are now.” Mead’s predictions may or may not prove prescient, but it is his thinking, more than Murray’s, that reflects the underlying optimism that has sustained the United States for more than two centuries — a refusal to believe that anything about human nature is essentially “intractable.” Mead’s way of looking at things is not only more inviting than Murray’s, it is also more on target.
charlottedonoho

Smarter Every Year? Mystery of the Rising IQs - WSJ - 0 views

  • That’s because absolute performance on IQ tests—the actual number of questions people get right—has greatly improved over the last 100 years. It’s called the Flynn effect, after James Flynn, a social scientist at New Zealand’s University of Otago who first noticed it in the 1980s.
  • They found that the Flynn effect is real—and large. The absolute scores consistently improved for children and adults, for developed and developing countries. People scored about three points more every decade, so the average score is 30 points higher than it was 100 years ago.
  • The pace jumped in the 1920s and slowed down during World War II. The scores shot up again in the postwar boom and then slowed down again in the ’70s. They’re still rising, but even more slowly. Adult scores climbed more than children’s.
  • ...5 more annotations...
  • Genes couldn’t change that swiftly, but better nutrition and health probably played a role. Still, that can’t explain why the change affected adults’ scores more than children’s. Economic prosperity helped, too—IQ increases correlate significantly with higher gross domestic product.
  • The fact that more people go to school for longer likely played the most important role—more education also correlates with IQ increases. That could explain why adults, who have more schooling, benefited most.
  • The best explanation probably depends on some combination of factors. Dr. Flynn himself argues for a “social multiplier” theory. An initially small change can set off a benign circle that leads to big effects. Slightly better education, health, income or nutrition might make a child do better at school and appreciate learning more. That would motivate her to read more books and try to go to college, which would make her even smarter and more eager for education, and so on.
  • “Life history” is another promising theory. A longer period of childhood correlates with better learning abilities across many species.
  • The thing that really makes humans so smart, throughout our history, may be that we can invent new kinds of intelligence to suit our changing environments.
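The Flynn-effect figures above are a simple linear accumulation: roughly three points of absolute test performance per decade compounds to the 30-point gain over a century. A toy model (assuming a constant rate, which the bullets above note is not quite accurate, since the pace varied by era):

```python
# Simplified linear model of the Flynn effect: absolute performance on
# IQ tests rises about 3 points per decade. (The real rate varied: it
# jumped in the 1920s, slowed in WWII, and has slowed again recently.)
POINTS_PER_DECADE = 3

def score_gain(years):
    """Cumulative gain in absolute test performance over `years`."""
    return POINTS_PER_DECADE * years / 10

# Someone scoring an average 100 today would, measured against the
# norms of 100 years ago, score about 130 -- the "30 points higher"
# figure quoted above.
print(100 + score_gain(100))  # 130.0
```

Note that published mean IQ stays at 100 only because tests are periodically re-normed; the model tracks the underlying raw-performance gain that the re-norming hides.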
margogramiak

Low-income preschoolers exposed to nurturing care have higher IQ scores later on -... - 0 views

  • Preschoolers living in impoverished communities who have access to a nurturing home environment have significantly higher intelligence quotient (IQ) scores in adolescence compared to those raised without nurturing care.
    • margogramiak
       
      In class, we've talked about the effects of economic and emotional states growing up.
  • They found that prenatal and early life adversities matter throughout life.
    • margogramiak
       
      Of course they do! How could they not? In Spanish, we learned about the "circle of poverty," which definitely applies here.
  • ...5 more annotations...
  • They also found that being raised in a nurturing environment could significantly counteract the detrimental effect of early adversities on IQ and help children achieve their full intellectual potential.
    • margogramiak
       
      I think "feeling loved" and "feeling like you are enough" is a hug contributor to success. If you are told you can do something, you are much more confident than if you are told you can't, obviously.
  • A nurturing environment also led to better growth and fewer psycho-social difficulties in adolescence, but it did not mitigate the effects of early adversities on growth and psycho-social difficulties.
    • margogramiak
       
      Interesting.
  • one in five children are raised in poverty and 15 percent do not complete high school, with higher rates for children in Black and Hispanic families.
    • margogramiak
       
      These are very impactful stats.
  • Parents want to provide nurturing environments and we need to help them." She said this includes interacting with young children in a positive way such as reading children's books from the library, singing songs together, and playing games with numbers and letters. Children who engage in age-appropriate chores with adult supervision like picking up toys and clearing the table gain skills and feel good about helping.
    • margogramiak
       
      This is up to the parents though, isn't it? How can the community solve the issue of lack of nurture in a household?
  • "This research highlights the importance of nurturing caregivers, both at home and at school to help children lead more productive lives as adults."
    • margogramiak
       
      It seems obvious nurturing has positive effects. I find it hard to believe that anyone who doesn't nurture their children would read this article and change the way they parent. I wish the article suggested a way for the community to help out, but I don't think this is a possibility.
Javier E

Survival Of The Highest « The Dish - 0 views

  • the Savanna-IQ Interaction Hypothesis
  • this hypothesis predicts that individuals of higher intelligence are more likely to engage in novel behavior that goes against cultural traditions or social norms.
  • a forty-year-long study funded by the British government paralleled this hypothesis, and found that “very bright” individuals with IQs above 125 were about twice as likely to have tried psychoactive drugs than “very dull” individuals with IQs below 75. As Kanazawa explains, “Intelligent people don’t always do the ‘right’ thing, only the evolutionarily novel thing.”
Javier E

Study Shows Why Lawyers Are So Smart - WSJ.com - 2 views

  • The research team performed brain scans on 24 college students and recent graduates, both before and after they spent 100 hours studying for the LSAT over a three-month period. The researchers also scanned 23 young adults who didn't study for the test. For those who studied, the results showed increased connectivity between the frontal lobes of the brain, as well as between the frontal and parietal lobes, which are parts of the brain associated with reasoning and thinking.
  • The study focused on fluid reasoning—the ability to tackle a novel problem—which is a central part of IQ tests and can to some degree predict academic performance or ability in demanding careers.
  • "People assume that IQ tests measure some stable characteristic of an individual, but we think this whole assumption is flawed," said Silvia Bunge, the study's senior author. "We think the skills measured by an IQ test wax and wane over time depending on the individual's level of cognitive activity."
tornekm

Of bairns and brains | The Economist - 0 views

  • especially given the steep price at which it was bought. Humans’ outsized, power-hungry brains suck up around a quarter of their body’s oxygen supplies.
  • It was simply humanity’s good fortune that those big sexy brains turned out to be useful for lots of other things, from thinking up agriculture to building internal-combustion engines. Another idea is that human cleverness arose out of the mental demands of living in groups whose members are sometimes allies and sometimes rivals.
  • human infants take a year to learn even to walk, and need constant supervision for many years afterwards. That helplessness is thought to be one consequence of intelligence—or, at least, of brain size.
  • ...6 more annotations...
  • ever-more incompetent infants, requiring ever-brighter parents to ensure they survive childhood.
  • The self-reinforcing nature of the process would explain why intelligence is so strikingly overdeveloped in humans compared even with chimpanzees.
  • developed first in primates, a newish branch of the mammals, a group that is itself relatively young.
  • found that babies born to mothers with higher IQs had a better chance of surviving than those born to low-IQ women, which bolsters the idea that looking after human babies is indeed cognitively taxing.
  • none of this adds up to definitive proof.
  • Any such feedback loop would be a slow process (at least as reckoned by the humans themselves), most of which would have taken place in the distant past.
Javier E

I.Q. Points for Sale, Cheap - NYTimes.com - 1 views

  • Until recently, the overwhelming consensus in psychology was that intelligence was essentially a fixed trait. But in 2008, an article by a group of researchers led by Susanne Jaeggi and Martin Buschkuehl challenged this view and renewed many psychologists’ enthusiasm about the possibility that intelligence was trainable — with precisely the kind of tasks that are now popular as games.
  • it’s important to explain why we’re not sold on the idea.
  • There have been many attempts to demonstrate large, lasting gains in intelligence through educational interventions, with few successes. When gains in intelligence have been achieved, they have been modest and the result of many years of effort.
  • ...3 more annotations...
  • Web site PsychFileDrawer.org, which was founded as an archive for failed replication attempts in psychological research, maintains a Top 20 list of studies that its users would like to see replicated. The Jaeggi study is currently No. 1.
  • Another reason for skepticism is a weakness in the Jaeggi study’s design: it included only a single test of reasoning to measure gains in intelligence.
  • Demonstrating that subjects are better on one reasoning test after cognitive training doesn’t establish that they’re smarter. It merely establishes that they’re better on one reasoning test.
Javier E

What's Wrong With the Teenage Mind? - WSJ.com - 1 views

  • What happens when children reach puberty earlier and adulthood later? The answer is: a good deal of teenage weirdness. Fortunately, developmental psychologists and neuroscientists are starting to explain the foundations of that weirdness.
  • The crucial new idea is that there are two different neural and psychological systems that interact to turn children into adults. Over the past two centuries, and even more over the past generation, the developmental timing of these two systems has changed. That, in turn, has profoundly changed adolescence and produced new kinds of adolescent woe. The big question for anyone who deals with young people today is how we can go about bringing these cogs of the teenage mind into sync once again
  • The first of these systems has to do with emotion and motivation. It is very closely linked to the biological and chemical changes of puberty and involves the areas of the brain that respond to rewards. This is the system that turns placid 10-year-olds into restless, exuberant, emotionally intense teenagers, desperate to attain every goal, fulfill every desire and experience every sensation. Later, it turns them back into relatively placid adults.
  • ...23 more annotations...
  • adolescents aren't reckless because they underestimate risks, but because they overestimate rewards—or, rather, find rewards more rewarding than adults do. The reward centers of the adolescent brain are much more active than those of either children or adults.
  • What teenagers want most of all are social rewards, especially the respect of their peers
  • Becoming an adult means leaving the world of your parents and starting to make your way toward the future that you will share with your peers. Puberty not only turns on the motivational and emotional system with new force, it also turns it away from the family and toward the world of equals.
  • The second crucial system in our brains has to do with control; it channels and harnesses all that seething energy. In particular, the prefrontal cortex reaches out to guide other parts of the brain, including the parts that govern motivation and emotion. This is the system that inhibits impulses and guides decision-making, that encourages long-term planning and delays gratification.
  • Today's adolescents develop an accelerator a long time before they can steer and brake.
  • Expertise comes with experience.
  • In gatherer-hunter and farming societies, childhood education involves formal and informal apprenticeship. Children have lots of chances to practice the skills that they need to accomplish their goals as adults, and so to become expert planners and actors.
  • In the past, to become a good gatherer or hunter, cook or caregiver, you would actually practice gathering, hunting, cooking and taking care of children all through middle childhood and early adolescence—tuning up just the prefrontal wiring you'd need as an adult. But you'd do all that under expert adult supervision and in the protected world of childhood
  • In contemporary life, the relationship between these two systems has changed dramatically. Puberty arrives earlier, and the motivational system kicks in earlier too. At the same time, contemporary children have very little experience with the kinds of tasks that they'll have to perform as grown-ups.
  • The experience of trying to achieve a real goal in real time in the real world is increasingly delayed, and the growth of the control system depends on just those experiences.
  • This control system depends much more on learning. It becomes increasingly effective throughout childhood and continues to develop during adolescence and adulthood, as we gain more experience.
  • An ever longer protected period of immaturity and dependence—a childhood that extends through college—means that young humans can learn more than ever before. There is strong evidence that IQ has increased dramatically as more children spend more time in school
  • children know more about more different subjects than they ever did in the days of apprenticeships.
  • Wide-ranging, flexible and broad learning, the kind we encourage in high-school and college, may actually be in tension with the ability to develop finely-honed, controlled, focused expertise in a particular skill, the kind of learning that once routinely took place in human societies.
  • this new explanation based on developmental timing elegantly accounts for the paradoxes of our particular crop of adolescents.
  • First, experience shapes the brain.
  • the brain is so powerful precisely because it is so sensitive to experience. It's as true to say that our experience of controlling our impulses make the prefrontal cortex develop as it is to say that prefrontal development makes us better at controlling our impulses
  • Second, development plays a crucial role in explaining human nature
  • there is more and more evidence that genes are just the first step in complex developmental sequences, cascades of interactions between organism and environment, and that those developmental processes shape the adult brain. Even small changes in developmental timing can lead to big changes in who we become.
  • Brain research is often taken to mean that adolescents are really just defective adults—grown-ups with a missing part.
  • But the new view of the adolescent brain isn't that the prefrontal lobes just fail to show up; it's that they aren't properly instructed and exercised
  • Instead of simply giving adolescents more and more school experiences—those extra hours of after-school classes and homework—we could try to arrange more opportunities for apprenticeship
  • Summer enrichment activities like camp and travel, now so common for children whose parents have means, might be usefully alternated with summer jobs, with real responsibilities.
  •  
    The two brain systems, the increasing gap between them, and the implications for adolescent education.
Javier E

It's Not Just About Bad Choices - NYTimes.com - 0 views

  • WHENEVER I write about people who are struggling, I hear from readers who say something like: Folks need to stop whining and get a job. It’s all about personal responsibility.
  • In a 2014 poll, Republicans were twice as likely to say that people are poor because of individual failings as to say the reason is lack of opportunity (Democrats thought the opposite). I decided to ask some of the poor w
  • Too often, I believe, liberals deny that poverty is linked to bad choices. As Phillips and many other poor people acknowledge, of course, it is.
  • ...11 more annotations...
  • Self-destructive behaviors — dropping out of school, joining a gang, taking drugs, bearing children when one isn’t ready — compound poverty.
  • Yet scholars are also learning to understand the roots of these behaviors, and they’re far more complicated than the conservative narrative of human weakness.
  • For starters, there is growing evidence that poverty and mental health problems are linked in complex, reinforcing ways
  • If you’re battling mental health problems, or grow up with traumas like domestic violence (or seeing your brother shot dead), you’re more likely to have trouble in school, to self-medicate with drugs or alcohol, to have trouble in relationships.
  • A second line of research has shown that economic stress robs us of cognitive bandwidth.
  • Worrying about bills, food or other problems, leaves less capacity to think ahead or to exert self-discipline. So, poverty imposes a mental tax.
  • It turns out that when people have elevated levels of cortisol, a stress hormone, they are less willing to delay gratification.
  • it’s circumstances that can land you in a situation where it’s really hard to make a good decision because you’re so stressed out. And the ones you get wrong matter much more, because there’s less slack to play with.”
  • That emphasis on personal responsibility is part of the 12-step program to confront alcoholism or drug addiction, and it may be useful for people like Jackson. But for society to place the blame entirely on the individual seems to me a cop-out.
  • Let’s also remember, though, that today we have randomized trials — the gold standard of evidence — showing that certain social programs make self-destructive behaviors less common.
  • as long as we’re talking about personal irresponsibility, let’s also examine our own. Don’t we have a collective responsibility to provide more of a fair start in life to all, so that children aren’t propelled toward bad choices?
Javier E

History News Network | Are You a Genius? - 0 views

  • the real question is not so much ‘What is genius?’ or even ‘Who is a genius?’ but rather, ‘What stake do we have in the whole idea of genius?’ or even, ‘Who’s asking and what’s behind their question?’
  • These are the issues I address in my new book by looking at the different views and theories of genius over the course of three centuries, from the start of the eighteenth century to the present day.
  • I concentrated on France, partly because French literature and intellectual history happen to be my area of expertise and personal interest; partly because the French contribution to the literature on genius hasn’t received its due; but mostly because the variety and the inventiveness of the views and theories of genius in France was a story worth telling for itself
  • ...4 more annotations...
  • For me it’s this literature, more than the phenomenon itself, which makes genius a topic worth paying attention to. And the more you read, the less likely you are to be able to come up with any definition of what genius might be.
  • For eighteenth-century commentators, genius was self-evident: you knew it when you saw it, but for the nineteenth-century Romantics, genius was essentially misunderstood, and only genius itself was capable of recognizing its own kind
  • After the French Revolution, the question of national genius (another sense of the word, deriving from the genius loci) was subject to particularly anxious or over-assertive scrutiny. A number of nineteenth-century novels allowed for a rare feminine role in genius, but almost always doomed genius to failure. The medical profession turned the genius into a madman, while the experimental psychologists at the end of the century devised the IQ test which made genius nothing more than a high point on a continuous scale of intelligence. Child prodigies were the stuff of children’s literature but real examples in the twentieth century generated skepticism about the whole notion of genius, until Julia Kristeva came along and rehabilitated genius as essentially feminine, and Jacques Derrida embraced imposture as its essential quality
  • What all this indicates is that the idea of genius is curiously labile, that it changes shape, definition and value according to the way it’s talked about, but also that there’s something about the idea that, as Claude Lévi-Strauss said about animals, makes it ‘good to think with.’        
maddieireland334

How Smart Should the President Be? - 0 views

  •  
    Do the smartest presidents make the best presidents? This question invariably emerges as a topic of spirited debate when the U.S. presidential election approaches. In 2004, former New York Times Executive Editor Howell Raines asked, "Does anyone in America doubt that Kerry has a higher IQ than Bush?"
Javier E

How To Look Smart, Ctd - The Daily Dish | By Andrew Sullivan - 0 views

  • these questions tend to overlook the way IQ tests are designed. As a neuropsychologist who has administered hundreds of these measures, I can tell you that their structures reflect a deeply embedded bias toward intelligence as a function of reading skills
Javier E

Raymond Tallis Takes Out the 'Neurotrash' - The Chronicle Review - The Chronicle of Hig... - 0 views

  • Tallis informs 60 people gathered in a Kent lecture hall that his talk will demolish two "pillars of unwisdom." The first, "neuromania," is the notion that to understand people you must peer into the "intracranial darkness" of their skulls with brain-scanning technology. The second, "Darwinitis," is the idea that Charles Darwin's evolutionary theory can explain not just the origin of the human species—a claim Tallis enthusiastically accepts—but also the nature of human behavior and institutions.
  • Aping Mankind argues that neuroscientific approaches to things like love, wisdom, and beauty are flawed because you can't reduce the mind to brain activity alone.
  • Stephen Cave, a Berlin-based philosopher and writer who has called Aping Mankind "an important work," points out that most philosophers and scientists do in fact believe "that mind is just the product of certain brain activity, even if we do not currently know quite how." Tallis "does both the reader and these thinkers an injustice" by declaring that view "obviously" wrong,
  • ...5 more annotations...
  • Geraint Rees, director of University College London's Institute of Cognitive Neuroscience, complains that reading Tallis is "a bit like trying to nail jelly to the wall." He "rubbishes every current theory of the relationship between mind and brain, whether philosophical or neuroscientific," while offering "little or no alternative,"
  • cultural memes. The Darwinesque concept originates in Dawkins's 1976 book, The Selfish Gene. Memes are analogous to genes, Dennett has said, "replicating units of culture" that spread from mind to mind like a virus. Religion, chess, songs, clothing, tolerance for free speech—all have been described as memes. Tallis considers it absurd to talk of a noun-phrase like "tolerance for free speech" as a discrete entity. But Dennett argues that Tallis's objections are based on "a simplistic idea of what one might mean by a unit." Memes aren't units? Well, in that spirit, says Dennett, organisms aren't units of biology, nor are species—they're too complex, with too much variation. "He's got to allow theory to talk about entities which are not simple building blocks," Dennett says.
  • How is it that he perceives the glass of water on the table? How is it that he feels a sense of self over time? How is it that he can remember a patient he saw in 1973, and then cast his mind forward to his impending visit to the zoo? There are serious problems with trying to reduce such things to impulses in the brain, he argues. We can explain "how the light gets in," he says, but not "how the gaze looks out." And isn't it astonishing, he adds, that much neural activity seems to have no link to consciousness? Instead, it's associated with things like controlling automatic movements and regulating blood pressure. Sure, we need the brain for consciousness: "Chop my head off, and my IQ descends." But it's not the whole story. There is more to perceptions, memories, and beliefs than neural impulses can explain. The human sphere encompasses a "community of minds," Tallis has written, "woven out of a trillion cognitive handshakes of shared attention, within which our freedom operates and our narrated lives are led." Those views on perception and memory anchor his attack on "neurobollocks." Because if you can't get the basics right, he says, then it's premature to look to neuroscience for clues to complex things like love.
  • Yes, many unanswered questions persist. But these are early days, and neuroscience remains immature, says Churchland, a professor emerita of philosophy at University of California at San Diego and author of the subfield-spawning 1986 book Neurophilosophy. In the 19th century, she points out, people thought we'd never understand light. "Well, by gosh," she says, "by the time the 20th century rolls around, it turns out that light is electromagnetic radiation. ... So the fact that at a certain point in time something seems like a smooth-walled mystery that we can't get a grip on, doesn't tell us anything about whether some real smart graduate student is going to sort it out in the next 10 years or not."
  • Dennett claims he's got much of it sorted out already. He wrote a landmark book on the topic in 1991, Consciousness Explained. (The title "should have landed him in court, charged with breach of the Trade Descriptions Act," writes Tallis.) Dennett uses the vocabulary of computer science to explain how consciousness emerges from the huge volume of things happening in the brain all at once. We're not aware of everything, he tells me, only a "limited window." He describes that stream of consciousness as "the activities of a virtual machine which is running on the parallel hardware of the brain." "You—the fruits of all your experience, not just your genetic background, but everything you've learned and done and all your memories—what ties those all together? What makes a self?" Dennett asks. "The answer is, and has to be, the self is like a software program that organizes the activities of the brain."
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol... - 0 views

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • ...43 more annotations...
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, he said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who were exploiting the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
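The “slot machine” dynamic Harris describes earlier is, in behavioural terms, a variable-ratio reward schedule: each refresh pays off unpredictably, which is what makes the loop so habit-forming. A minimal simulation of that idea (the probability and reward labels are illustrative, not from the article):

```python
import random

def pull_to_refresh(rng, reward_probability=0.3):
    """One simulated refresh: an unpredictable payoff, like a slot machine."""
    if rng.random() < reward_probability:
        # Sometimes it's a beautiful photo. Sometimes it's just an ad.
        return rng.choice(["beautiful photo", "new like", "just an ad"])
    return None  # nothing new this time

def session(pulls=20, seed=1):
    """Count how many of `pulls` refreshes paid off, on a fixed seed."""
    rng = random.Random(seed)
    return sum(pull_to_refresh(rng) is not None for _ in range(pulls))
```

Because the payoff is intermittent rather than guaranteed, a user cannot predict which pull will be rewarded, and so keeps pulling.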
anonymous

Smarter brains run on sparsely connected neurons -- ScienceDaily - 0 views

  • The more intelligent a person, the fewer connections there are between the neurons in their cerebral cortex.
  • The researchers then cross-referenced the two data sets and found that the more intelligent a person, the fewer dendrites there are in their cerebral cortex.
  • For one, it had been previously ascertained that intelligent people tend to have larger brains. "The assumption has been that larger brains contain more neurons and, consequently, possess more computational power," says Erhan Genç. However, other studies had shown that -- despite their comparatively high number of neurons -- the brains of intelligent people demonstrated less neuronal activity during an IQ test than the brains of less intelligent individuals.