TOK Friends / Group items tagged fluid

Javier E

Why the very concept of 'general knowledge' is under attack | Times2 | The Times - 0 views

  • why has University Challenge lasted, virtually unchanged, for so long?
  • The answer may lie in a famous theory about our brains put forward by the psychologist Raymond Cattell in 1963
  • Cattell divided intelligence into two categories: fluid and crystallised. Fluid intelligence refers to basic reasoning and other mental activities that require minimal learning — just an alert and flexible brain.
  • By contrast, crystallised intelligence is based on experience and the accumulation of knowledge. Fluid intelligence peaks at the age of about 20 then gradually declines, whereas crystallised intelligence grows through your life until you hit your mid-sixties, when you start forgetting things.
  • that explains much about University Challenge’s appeal. Because the contestants are mostly aged around 20 and very clever, their fluid intelligence is off the scale
  • On the other hand, because they have had only 20 years to acquire crystallised intelligence, their store of general knowledge is likely to be lacking in some areas.
  • In each episode there will be questions that older viewers can answer, thanks to their greater store of crystallised intelligence, but the students cannot. Therefore we viewers don’t feel inferior when confronted by these smart young people. On the contrary: we feel, in some areas, slightly superior.
  • It’s a brilliantly balanced format
  • there is a real threat to the future of University Challenge and much else of value in our society, and it is this. The very concept of “general knowledge” — of a widely accepted core of information that educated, inquisitive people should have in their memory banks — is under attack from two different groups.
  • The first comprises the deconstructionists and decolonialists
  • They argue that all knowledge is contextual and that things taken for granted in the past — for instance, a canon of great authors that everyone should read at school — merely reflect an outdated, usually Eurocentric view of what’s intellectually important.
  • The other group is the technocrats who argue that the extent of human knowledge is now so vast that it’s impossible for any individual to know more than, perhaps, one billionth of it
  • So why not leave it entirely to computers to do the heavy lifting of knowledge storing and recall, thus freeing our minds for creativity and problem solving?
  • The problem with the agitators on both sides of today’s culture wars is that they are forcefully trying to shape what’s accepted as general knowledge according to a blatant political agenda.
  • And the problem with relying on, say, Wikipedia’s 6.5 million English-language articles to store general knowledge for all of us? It’s the tacit implication that “mere facts” are too tedious to be clogging up our brains. From there it’s a short step to saying that facts don’t matter at all, that everything should be decided by “feelings”. And from there it’s an even shorter step to fake news and pernicious conspiracy theories, the belittling of experts and hard evidence, the closing of minds, the thickening of prejudice and the trivialisation of the national conversation.
Javier E

The Navy's USS Gabrielle Giffords and the Future of Work - The Atlantic - 0 views

  • Minimal manning—and with it, the replacement of specialized workers with problem-solving generalists—isn’t a particularly nautical concept. Indeed, it will sound familiar to anyone in an organization who’s been asked to “do more with less”—which, these days, seems to be just about everyone.
  • Ten years from now, the Deloitte consultant Erica Volini projects, 70 to 90 percent of workers will be in so-called hybrid jobs or superjobs—that is, positions combining tasks once performed by people in two or more traditional roles.
  • If you ask Laszlo Bock, Google’s former culture chief and now the head of the HR start-up Humu, what he looks for in a new hire, he’ll tell you “mental agility.
  • “What companies are looking for,” says Mary Jo King, the president of the National Résumé Writers’ Association, “is someone who can be all, do all, and pivot on a dime to solve any problem.”
  • The phenomenon is sped by automation, which usurps routine tasks, leaving employees to handle the nonroutine and unanticipated—and the continued advance of which throws the skills employers value into flux
  • Or, for that matter, on the relevance of the question What do you want to be when you grow up?
  • By 2020, a 2016 World Economic Forum report predicted, “more than one-third of the desired core skill sets of most occupations” will not have been seen as crucial to the job when the report was published
  • I asked John Sullivan, a prominent Silicon Valley talent adviser, why should anyone take the time to master anything at all? “You shouldn’t!” he replied.
  • Minimal manning—and the evolution of the economy more generally—requires a different kind of worker, with not only different acquired skills but different inherent abilities
  • It has implications for the nature and utility of a college education, for the path of careers, for inequality and employability—even for the generational divide.
  • Then, in 2001, Donald Rumsfeld arrived at the Pentagon. The new secretary of defense carried with him a briefcase full of ideas from the corporate world: downsizing, reengineering, “transformational” technologies. Almost immediately, what had been an experimental concept became an article of faith
  • But once cadets got into actual command environments, which tend to be fluid and full of surprises, a different picture emerged. “Psychological hardiness”—a construct that includes, among other things, a willingness to explore “multiple possible response alternatives,” a tendency to “see all experience as interesting and meaningful,” and a strong sense of self-confidence—was a better predictor of leadership ability in officers after three years in the field.
  • Because there really is no such thing as multitasking—just a rapid switching of attention—I began to feel overstrained, put upon, and finally irked by the impossible set of concurrent demands. Shouldn’t someone be giving me a hand here? This, Hambrick explained, meant I was hitting the limits of working memory—basically, raw processing power—which is an important aspect of “fluid intelligence” and peaks in your early 20s. This is distinct from “crystallized intelligence”—the accumulated facts and know-how on your hard drive—which peaks in your 50s.
  • Others noticed the change but continued to devote equal attention to all four tasks. Their scores fell. This group, Hambrick found, was high in “conscientiousness”—a trait that’s normally an overwhelming predictor of positive job performance. We like conscientious people because they can be trusted to show up early, double-check the math, fill the gap in the presentation, and return your car gassed up even though the tank was nowhere near empty to begin with. What struck Hambrick as counterintuitive and interesting was that conscientiousness here seemed to correlate with poor performance.
  • he discovered another correlation in his test: The people who did best tended to score high on “openness to new experience”—a personality trait that is normally not a major job-performance predictor and that, in certain contexts, roughly translates to “distractibility.”
  • To borrow the management expert Peter Drucker’s formulation, people with this trait are less focused on doing things right, and more likely to wonder whether they’re doing the right things.
  • High in fluid intelligence, low in experience, not terribly conscientious, open to potential distraction—this is not the classic profile of a winning job candidate. But what if it is the profile of the winning job candidate of the future?
  • One concerns “grit”—a mind-set, much vaunted these days in educational and professional circles, that allows people to commit tenaciously to doing one thing well
  • These ideas are inherently appealing; they suggest that dedication can be more important than raw talent, that the dogged and conscientious will be rewarded in the end.
  • he studied West Point students and graduates.
  • Traditional measures such as SAT scores and high-school class rank “predicted leader performance in the stable, highly regulated environment of West Point” itself.
  • It would be supremely ironic if the advance of the knowledge economy had the effect of devaluing knowledge. But that’s what I heard, recurrently.
  • “Fluid, learning-intensive environments are going to require different traits than classical business environments,” I was told by Frida Polli, a co-founder of an AI-powered hiring platform called Pymetrics. “And they’re going to be things like ability to learn quickly from mistakes, use of trial and error, and comfort with ambiguity.”
  • “We’re starting to see a big shift,” says Guy Halfteck, a people-analytics expert. “Employers are looking less at what you know and more and more at your hidden potential” to learn new things
  • advice to employers? Stop hiring people based on their work experience. Because in these environments, expertise can become an obstacle.
  • “The Curse of Expertise.” The more we invest in building and embellishing a system of knowledge, they found, the more averse we become to unbuilding it.
  • All too often experts, like the mechanic in LePine’s garage, fail to inspect their knowledge structure for signs of decay. “It just didn’t occur to him,” LePine said, “that he was repeating the same mistake over and over.
  • The devaluation of expertise opens up ample room for different sorts of mistakes—and sometimes creates a kind of helplessness.
  • Aboard littoral combat ships, the crew lacks the expertise to carry out some important tasks, and instead has to rely on civilian help
  • Meanwhile, the modular “plug and fight” configuration was not panning out as hoped. Converting a ship from sub-hunter to minesweeper or minesweeper to surface combatant, it turned out, was a logistical nightmare
  • So in 2016 the concept of interchangeability was scuttled for a “one ship, one mission” approach, in which the extra 20-plus sailors became permanent crew members
  • “As equipment breaks, [sailors] are required to fix it without any training,” a Defense Department Test and Evaluation employee told Congress. “Those are not my words. Those are the words of the sailors who were doing the best they could to try to accomplish the missions we gave them in testing.”
  • These results were, perhaps, predictable given the Navy’s initial, full-throttle approach to minimal manning—and are an object lesson on the dangers of embracing any radical concept without thinking hard enough about the downsides
  • a world in which mental agility and raw cognitive speed eclipse hard-won expertise is a world of greater exclusion: of older workers, slower learners, and the less socially adept.
  • if you keep going down this road, you end up with one really expensive ship with just a few people on it who are geniuses … That’s not a future we want to see, because you need a large enough crew to conduct multiple tasks in combat.
  • What does all this mean for those of us in the workforce, and those of us planning to enter it? It would be wrong to say that the 10,000-hours-of-deliberate-practice idea doesn’t hold up at all. In some situations, it clearly does
  • A spinal surgery will not be performed by a brilliant dermatologist. A criminal-defense team will not be headed by a tax attorney. And in tech, the demand for specialized skills will continue to reward expertise handsomely.
  • But in many fields, the path to success isn’t so clear. The rules keep changing, which means that highly focused practice has a much lower return
  • In uncertain environments, Hambrick told me, “specialization is no longer the coin of the realm.”
  • It leaves us with lifelong learning,
  • I found myself the target of career suggestions. “You need to be a video guy, an audio guy!” the Silicon Valley talent adviser John Sullivan told me, alluding to the demise of print media
  • I found the prospect of starting over just plain exhausting. Building a professional identity takes a lot of resources—money, time, energy. After it’s built, we expect to reap gains from our investment, and—let’s be honest—even do a bit of coasting. Are we equipped to continually return to apprentice mode? Will this burn us out?
  • Everybody I met on the Giffords seemed to share that mentality. They regarded every minute on board—even during a routine transit back to port in San Diego Harbor—as a chance to learn something new.
sanderk

The 'Availability Bias' Is Driving Investor Decisions - Business Insider - 0 views

  • What availability bias tells us is that investors’ lingering perceptions of a dire market environment may be causing them to view investment opportunities through an overly negative lens, making it less appealing to consider taking on investment risk, no matter how small the returns on perceived “safe” investments.
  • “Imagine if I was a financial advisor and you came to talk to me about your risk attitude, and I started the discussion by asking you to describe how you felt in the last three years on the days when your portfolio lost 5% of its value. Then I asked you what your risk attitude was. Most people would say they don’t want to ever experience days like that again. On the other hand, what if instead I talked about people I knew who were retired and living in the Bahamas, fishing and golfing. Now your risk attitude would probably be different.”
  • As humans, our thinking is strongly influenced by what is personally most relevant, recent or dramatic.
  • lingering perceptions based on dramatic, painful events are impacting decision-making even when those events are over.
  • Ariely said a home country investment bias might be generated by two perceptual factors. “The first is an overly optimistic belief about one’s own economy; an expectation of performance in their country that is higher than what would be statistically realistic. The second reason is most likely due to procedural difficulties in investing outside the country – such as less knowledge about how to access these markets.”
  • investors may be making decisions driven more by personal bias or irrational belief than by reality and, in doing so, they may be hindering their own investment success.
  • The problem? These decisions may hinder their ability to reach their desired retirement or savings goals. The choice is between changing the goal—or changing the means of reaching it.
sanderk

Why people believe the Earth is flat and we should listen to anti-vaxxers | Elfy Scott ... - 0 views

  • I understand why scientifically minded people experience profound frustration at the nonsense, particularly when we’re forced to consider the public health implications of the anti-vaxxer movement which has been blamed as the root cause for recent outbreaks of measles in the US, a viral infection which can prove devastating for babies and young children. Misinformation can cause immense suffering and we should do our utmost to dispel the lies.
  • Too many people in scientific spheres seem to revel in dismissing flat-Earthers and anti-vaxxers as garden variety nut-jobs and losers. It may be cathartic – but it’s not productive.
  • It’s interesting that for a scientific community so perennially pleased with itself, we all seem to be making the same fundamental attribution error by ignoring the notion that belief in pseudoscience and conspiracy theories is propelled by external pressures of fear, confusion and disempowerment. Instead we seem too often satisfied with pinning the nonsense on some bizarrely flourishing individual idiocy.
  • When we feel so fundamentally disenfranchised, it’s comforting to concoct a fictional universe that systemically denies you the right cards. It gives you something to fight against and makes you self-deterministic. It provides an “us and them” narrative that allows you to conceive of yourself as a little David raging against a rather haughty, intellectual establishment Goliath. This is what worries me about journalists writing columns or tweets sneering at the supposed stupidity of the pseudoscientists and conspiracy theorists – it only serves to enforce this “us and them” worldview.
Emily Freilich

The Man Who Would Teach Machines to Think - James Somers - The Atlantic - 1 views

  • Douglas Hofstadter, the Pulitzer Prize–winning author of Gödel, Escher, Bach, thinks we've lost sight of what artificial intelligence really means. His stubborn quest to replicate the human mind.
  • “If somebody meant by artificial intelligence the attempt to understand the mind, or to create something human-like, they might say—maybe they wouldn’t go this far—but they might say this is some of the only good work that’s ever been done
  • Their operating premise is simple: the mind is a very unusual piece of software, and the best way to understand how a piece of software works is to write it yourself.
  • “It depends on what you mean by artificial intelligence.”
  • Computers are flexible enough to model the strange evolved convolutions of our thought, and yet responsive only to precise instructions. So if the endeavor succeeds, it will be a double victory: we will finally come to know the exact mechanics of our selves—and we’ll have made intelligent machines.
  • Ever since he was about 14, when he found out that his youngest sister, Molly, couldn’t understand language, because she “had something deeply wrong with her brain” (her neurological condition probably dated from birth, and was never diagnosed), he had been quietly obsessed by the relation of mind to matter.
  • How could consciousness be physical? How could a few pounds of gray gelatin give rise to our very thoughts and selves?
  • Consciousness, Hofstadter wanted to say, emerged via just the same kind of “level-crossing feedback loop.”
  • In 1931, the Austrian-born logician Kurt Gödel had famously shown how a mathematical system could make statements not just about numbers but about the system itself.
  • But then AI changed, and Hofstadter didn’t change with it, and for that he all but disappeared.
  • By the early 1980s, the pressure was great enough that AI, which had begun as an endeavor to answer yes to Alan Turing’s famous question, “Can machines think?,” started to mature—or mutate, depending on your point of view—into a subfield of software engineering, driven by applications.
  • Take Deep Blue, the IBM supercomputer that bested the chess grandmaster Garry Kasparov. Deep Blue won by brute force.
  • Hofstadter wanted to ask: Why conquer a task if there’s no insight to be had from the victory? “Okay,” he says, “Deep Blue plays very good chess—so what? Does that tell you something about how we play chess? No. Does it tell you about how Kasparov envisions, understands a chessboard?”
  • AI started working when it ditched humans as a model — precisely because it ditched them. That’s the thrust of the analogy: Airplanes don’t flap their wings; why should computers think?
  • It’s a compelling point. But it loses some bite when you consider what we want: a Google that knows, in the way a human would know, what you really mean when you search for something
  • “Cognition is recognition,” he likes to say. He describes “seeing as” as the essential cognitive act: you see some lines as “an A,” you see a hunk of wood as “a table,” you see a meeting as “an emperor-has-no-clothes situation” and a friend’s pouting as “sour grapes”
  • How do you make a search engine that understands if you don’t know how you understand?
  • That’s what it means to understand. But how does understanding work?
  • analogy is “the fuel and fire of thinking,” the bread and butter of our daily mental lives.
  • there’s an analogy, a mental leap so stunningly complex that it’s a computational miracle: somehow your brain is able to strip any remark of the irrelevant surface details and extract its gist, its “skeletal essence,” and retrieve, from your own repertoire of ideas and experiences, the story or remark that best relates.
  • in Hofstadter’s telling, the story goes like this: when everybody else in AI started building products, he and his team, as his friend, the philosopher Daniel Dennett, wrote, “patiently, systematically, brilliantly,” way out of the light of day, chipped away at the real problem. “Very few people are interested in how human intelligence works,”
  • For more than 30 years, Hofstadter has worked as a professor at Indiana University at Bloomington
  • The quick unconscious chaos of a mind can be slowed down on the computer, or rewound, paused, even edited
  • A project out of IBM called Candide. The idea behind Candide, a machine-translation system, was to start by admitting that the rules-based approach requires too deep an understanding of how language is produced; how semantics, syntax, and morphology work; and how words commingle in sentences and combine into paragraphs—to say nothing of understanding the ideas for which those words are merely conduits.
  • Hofstadter directs the Fluid Analogies Research Group, affectionately known as FARG.
  • Parts of a program can be selectively isolated to see how it functions without them; parameters can be changed to see how performance improves or degrades. When the computer surprises you—whether by being especially creative or especially dim-witted—you can see exactly why.
  • When you read Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, which describes in detail this architecture and the logic and mechanics of the programs that use it, you wonder whether maybe Hofstadter got famous for the wrong book.
  • But very few people, even admirers of GEB, know about the book or the programs it describes. And maybe that’s because FARG’s programs are almost ostentatiously impractical. Because they operate in tiny, seemingly childish “microdomains.” Because there is no task they perform better than a human.
  • “The entire effort of artificial intelligence is essentially a fight against computers’ rigidity.”
  • “Nobody is a very reliable guide concerning activities in their mind that are, by definition, subconscious,” he once wrote. “This is what makes vast collections of errors so important. In an isolated error, the mechanisms involved yield only slight traces of themselves; however, in a large collection, vast numbers of such slight traces exist, collectively adding up to strong evidence for (and against) particular mechanisms.
  • So IBM threw that approach out the window. What the developers did instead was brilliant, but so straightforward,
  • The technique is called “machine learning.” The goal is to make a device that takes an English sentence as input and spits out a French sentence
  • What you do is feed the machine English sentences whose French translations you already know. (Candide, for example, used 2.2 million pairs of sentences, mostly from the bilingual proceedings of Canadian parliamentary debates.)
  • By repeating this process with millions of pairs of sentences, you will gradually calibrate your machine, to the point where you’ll be able to enter a sentence whose translation you don’t know and get a reasonable result (a toy sketch of this calibration idea appears after this list of annotations)
  • Google Translate team can be made up of people who don’t speak most of the languages their application translates. “It’s a bang-for-your-buck argument,” Estelle says. “You probably want to hire more engineers instead” of native speakers.
  • But the need to serve 1 billion customers has a way of forcing the company to trade understanding for expediency. You don’t have to push Google Translate very far to see the compromises its developers have made for coverage, and speed, and ease of engineering. Although Google Translate captures, in its way, the products of human intelligence, it isn’t intelligent itself.
  • “Did we sit down when we built Watson and try to model human cognition?” Dave Ferrucci, who led the Watson team at IBM, pauses for emphasis. “Absolutely not. We just tried to create a machine that could win at Jeopardy.”
  • For Ferrucci, the definition of intelligence is simple: it’s what a program can do. Deep Blue was intelligent because it could beat Garry Kasparov at chess. Watson was intelligent because it could beat Ken Jennings at Jeopardy.
  • “There’s a limited number of things you can do as an individual, and I think when you dedicate your life to something, you’ve got to ask yourself the question: To what end? And I think at some point I asked myself that question, and what it came out to was, I’m fascinated by how the human mind works, it would be fantastic to understand cognition, I love to read books on it, I love to get a grip on it”—he called Hofstadter’s work inspiring—“but where am I going to go with it? Really what I want to do is build computer systems that do something.
  • Peter Norvig, one of Google’s directors of research, echoes Ferrucci almost exactly. “I thought he was tackling a really hard problem,” he told me about Hofstadter’s work. “And I guess I wanted to do an easier problem.”
  • Of course, the folly of being above the fray is that you’re also not a part of it
  • As our machines get faster and ingest more data, we allow ourselves to be dumber. Instead of wrestling with our hardest problems in earnest, we can just plug in billions of examples of them.
  • Hofstadter hasn’t been to an artificial-intelligence conference in 30 years. “There’s no communication between me and these people,” he says of his AI peers. “None. Zero. I don’t want to talk to colleagues that I find very, very intransigent and hard to convince of anything
  • Everything from plate tectonics to evolution—all those ideas, someone had to fight for them, because people didn’t agree with those ideas.
  • Academia is not an environment where you just sit in your bath and have ideas and expect everyone to run around getting excited. It’s possible that in 50 years’ time we’ll say, ‘We really should have listened more to Doug Hofstadter.’ But it’s incumbent on every scientist to at least think about what is needed to get people to understand the ideas.”
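The Candide annotations above describe “calibrating” a translator from paired sentences. Below is a rough, hedged sketch of that idea, loosely in the spirit of the IBM Model 1 word-alignment algorithm rather than Candide’s actual implementation; the four-sentence corpus, iteration count, and function names are invented for illustration.

```python
# Toy sketch of learning word translations from paired sentences (IBM Model 1 flavor).
# The tiny "corpus" stands in for millions of real sentence pairs.
from collections import defaultdict

pairs = [
    ("the house", "la maison"),
    ("the book", "le livre"),
    ("a house", "une maison"),
    ("a book", "un livre"),
]
pairs = [(e.split(), f.split()) for e, f in pairs]

e_vocab = {w for e, _ in pairs for w in e}
f_vocab = {w for _, f in pairs for w in f}

# Start with uniform translation probabilities t(f | e), then refine by EM.
t = {(f, e): 1.0 / len(f_vocab) for e in e_vocab for f in f_vocab}

for _ in range(10):
    count = defaultdict(float)
    total = defaultdict(float)
    for e_sent, f_sent in pairs:
        for f in f_sent:
            norm = sum(t[(f, e)] for e in e_sent)
            for e in e_sent:
                share = t[(f, e)] / norm      # fractional alignment count
                count[(f, e)] += share
                total[e] += share
    for f, e in t:
        t[(f, e)] = count[(f, e)] / total[e]

def best_guess(english_word):
    """Most probable French word for an English word under the learned table."""
    return max(f_vocab, key=lambda f: t[(f, english_word)])

for word in ["house", "book"]:
    print(word, "->", best_guess(word))   # converges to: maison, livre
```

The point of the sketch is the one the article makes: nothing in the loop “understands” either language; the table is simply calibrated by co-occurrence across known translations.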
Javier E

The Older Mind May Just Be a Fuller Mind - NYTimes.com - 0 views

  • Memory’s speed and accuracy begin to slip around age 25 and keep on slipping.
  • Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing
  • Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
  • Neuroscientists have some reason to believe that neural processing speed, like many reflexes, slows over the years; anatomical studies suggest that the brain also undergoes subtle structural changes that could affect memory.
  • doubts about the average extent of the decline are rooted not in individual differences but in study methodology. Many studies comparing older and younger people, for instance, did not take into account the effects of pre-symptomatic Alzheimer’s disease,
  • The new data-mining analysis also raises questions about many of the measures scientists use. Dr. Ramscar and his colleagues applied leading learning models to an estimated pool of words and phrases that an educated 70-year-old would have seen, and another pool suitable for an educated 20-year-old. Their model accounted for more than 75 percent of the difference in scores between older and younger adults on items in a paired-associate test
  • That is to say, the larger the library you have in your head, the longer it usually takes to find a particular word (or pair). A toy numerical illustration of this idea appears after this list of annotations.
  • Scientists who study thinking and memory often make a broad distinction between “fluid” and “crystallized” intelligence. The former includes short-term memory, like holding a phone number in mind, analytical reasoning, and the ability to tune out distractions, like ambient conversation. The latter is accumulated knowledge, vocabulary and expertise.
  • an increase in crystallized intelligence can account for a decrease in fluid intelligence,
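The Ramscar finding summarized above turns on a simple quantitative point: if retrieval means searching a bigger mental “library,” lookups get slower even when nothing in the machinery declines. The snippet below is only a back-of-the-envelope illustration of that point, not Ramscar’s learning model; the vocabulary sizes and the sequential-search assumption are invented for the example.

```python
# Toy illustration: retrieval cost grows with the size of the mental "library",
# even though the search process itself never degrades.
import random

random.seed(0)

def mean_retrieval_steps(vocabulary_size, trials=10_000):
    """Average number of items examined when the target sits at a random position."""
    return sum(random.randint(1, vocabulary_size) for _ in range(trials)) / trials

young_lexicon = 20_000    # hypothetical 20-year-old vocabulary size
older_lexicon = 45_000    # hypothetical 70-year-old vocabulary size

print("young:", mean_retrieval_steps(young_lexicon))   # roughly 10,000 items
print("older:", mean_retrieval_steps(older_lexicon))   # roughly 22,500 items
```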
Javier E

Bones discovered in an island cave may be an early human species - The Washington Post - 0 views

  • Piper, Mijares and their team published a description of the foot bone in 2010. They knew it was the oldest human remain in the Philippines, dated to 67,000 years ago, based on the amount of the radioactive element uranium in the fossil
  • Mijares returned to Callao Cave and uncovered more remains in 2011 and 2015. All told, the scientists pulled a dozen fossilized parts from the cave — teeth, a thigh bone, finger bones and foot bones, representing three individuals. Attempts to extract DNA from the remains were unsuccessful.
  • The body parts are diminutive, suggesting Homo luzonensis grew no more than four feet tall. Its molars have modern shapes. The way its leg muscle attached to its thigh bone is “distinctively human,”
  • The bones in its hands and feet are curved, “spitting images” of the toes and finger bones that belonged to the ancient Australopithecus, Piper said. These hominids, such as the 3 million-year-old Australopithecus afarensis Lucy, had digits well-suited for climbing.
  • This species lived at the same time as humans with modern anatomy, who first appeared in the fossil record 200,000 years ago (or perhaps as long as 350,000 years ago). “We continue to realize that few thousands of years back in time, H. sapiens was definitely not alone on Earth,”
  • Though these fossils are the oldest in the Philippines, evidence for habitation is even older; 700,000 years ago, ancient butchers on Luzon carved up a rhinoceros with stone tools. Which species did the butchering is unknown.
  • A few “mammal species you find on Luzon appear to have come from the mainland,” Piper said. The Asian continent is 400 or more miles away through the Luzon strait. But in the Middle Pleistocene, when glacial sheets locked up vast amounts of water, sea levels dropped by as much as 400 feet, Piper said.
  • “I would just say that when humans could see land or they could smell it or they knew the signs, that birds were coming from it, they sought it out,” he said. “That’s not a Homo sapiens trait. It’s something our ancestors and extinct relatives had.”
  • The cartoon version of evolution, in which a hunched ape becomes a tall and jaunty biped, suggests a journey with a destination. The reality is messier,
  • An island’s confines can rapidly spark evolutionary change; Charles Darwin saw this in finches’ beaks.
  • “Isolation plays games,” Potts said. Homo floresiensis showed anthropologists that an island could be an “odd little laboratory of human evolution,” he said. These bones reinforce that lesson.
  • “It’s beginning to look like the evolutionary process is really fluid,” Potts said. “And it’s surprising that it is so fluid where each species of Homo may actually be a history or a record.” The result is a fusion of the modern and ancient: molars that could be yours alongside toes with millions-year-old curves.
  • Fifteen years ago, Hawks said, anthropologists chalked up the worldwide success of Homo sapiens to our modern anatomy. These new discoveries, in far-flung corners, suggest exceptionalism is not built into our brains or skeletons.
  • “The archaeological record is now showing us that ancient human forms were much more adaptable, and I would say clever, than we imagined,”
  • “This isn’t ‘Flowers for Algernon,’ where, suddenly, we’re super smart and everyone else in the world is behind us.” Scientists are now plumbing genomes for other clues to Homo sapiens’ survival, looking at our metabolisms or resistance to disease, he said. “I’d say the doors have opened, and we haven’t figured out where they lead.”
Javier E

When scientists saw the mouse heads glowing, they knew the discovery was big - The Wash... - 0 views

  • have found evidence linking problems in the lymphatic and glymphatic systems to Alzheimer’s. In a study on mice, they showed that glymphatic dysfunction contributes to the buildup in the brain of amyloid beta, a protein that plays a key role in the disease.
  • several colleagues examined postmortem tissue from 79 human brains. They focused on aquaporin-4, a key protein in glymphatic vessels. In the brains of people with Alzheimer’s, this protein was jumbled; in those without the disease, the protein was well organized. This suggests that glymphatic breakdowns may play a role in the disease
  • The vessels have also been implicated in autoimmune disease. Researchers knew that the immune system has limited access to the brain. But at the same time, the immune system kept tabs on the brain’s status; no one knew exactly how. Some researchers theorize that the glymphatic system could be the conduit and that in diseases such as multiple sclerosis — where the body’s immune system attacks certain brain cells — the communication may go awry.
  • The system may also play a role in symptoms of traumatic brain injury
  • Mice are a good model, she says, because their glymphatic systems are very similar to humans’. She and Iliff found that even months after being injured, the animals’ brains were still not clearing waste efficiently, leading to a buildup of toxic compounds, including amyloid beta. Nedergaard returns to the dishwasher analogy. “It’s like if you only use a third of the water when you turn on the machine,” she says. “You won’t get clean dishes.”
  • in mice, omega-3 fatty acids improved glymphatic functioning.
  • Nedergaard has shown that at least in mice, the system processes twice as much fluid during sleep as it does during wakefulness. She and her colleagues focused on amyloid beta; they found that the lymphatic system removed much more of the protein when the animals were asleep than when they were awake. She suggests that over time, sleep dysfunction may contribute to Alzheimer’s and perhaps other brain illnesses. “You only clean your brain when you’re sleeping,” she says. “This is probably an important reason that we sleep. You need time off from consciousness to do the housekeeping.”
  • Sleeping on your stomach is also not very effective; sleeping on your back is somewhat better, while lying on your side appears to produce the best results.
  • glymphatic flow is significantly decreased in the period just before a migraine. The intense pain in these headaches is caused largely by inflamed nerves in the tissue that surrounds the brain. Neuroscientists Rami Burstein and Aaron Schain, the lead authors, theorize that faulty clearance of molecular waste from the brain could trigger inflammation in these pain fibers.
  • other scientists have found that deep breathing significantly increases the glymphatic transport of cerebrospinal fluid into the brain.
caelengrubb

How Galileo Changed Your Life - Biography - 0 views

  • Galileo’s contributions to the fields of astronomy, physics, mathematics, and philosophy have led many to call him the father of modern science.
  • But his controversial theories, which impacted how we see and understand the solar system and our place within it, led to serious conflict with the Catholic Church and the long-time suppression of his achievements
  • Galileo developed one of the first telescopes. Galileo didn’t invent the telescope — it was invented by Dutch eyeglass makers — but he made significant improvements to it.
  • His innovations brought him both professional and financial success. He was given a lifetime tenure position at the University of Padua, where he had been teaching for several years, at double his salary.
  • And he received a contract to produce his telescopes for a group of Venetian merchants, eager to use them as a navigational tool.
  • He helped create modern astronomy. Galileo turned his new, high-powered telescope to the sky. In early 1610, he made the first in a remarkable series of discoveries.
  • While the scientific doctrine of the day held that space was a perfect, unchanging environment created by God, Galileo’s telescope helped change that view
  • His studies and drawings showed the Moon had a rough, uneven surface that was pockmarked in some places, and was actually an imperfect sphere
  • He was also one of the first people to observe the phenomena known as sunspots, thanks to his telescope which allowed him to view the sun for extended periods of time without damaging the eye.
  • Galileo helped prove that the Earth revolved around the sun. In 1610, Galileo published his new findings in the book Sidereus Nuncius, or Starry Messenger, which was an instant success
  • He became close with a number of other leading scientists, including Johannes Kepler. A German astronomer and mathematician, Kepler’s work helped lay the foundations for the later discoveries of Isaac Newton and others.
  • Kepler’s experiments had led him to support the idea that the planets, Earth included, revolved around the sun. This heliocentric theory, as well as the idea of Earth’s daily rotational turning, had been developed by Polish astronomer Nicolaus Copernicus half a century earlier
  • Their belief that the Sun, and not the Earth, was the gravitational center of the universe, upended almost 2,000 years of scientific thinking, dating back to theories about the fixed, unchanging universe put forth by the Greek philosopher and scientist Aristotle.
  • Galileo had been testing Aristotle’s theories for years, including an experiment in the late 16th century in which he dropped two items of different masses from the Leaning Tower of Pisa, disproving Aristotle’s belief that objects would fall at differing speeds based on their weight (Newton later improved upon this work).
  • Galileo paid a high price for his contributions. But challenging the Aristotelian or Ptolemaic theories about the Earth’s role in the universe was dangerous stuff.
  • Geocentrism was, in part, a theoretical underpinning of the Roman Catholic Church. Galileo’s work brought him to the attention of Church authorities, and in 1615, he was called before the Roman Inquisition, accused of heresy for beliefs which contradicted Catholic scripture.
  • The following year, the Church banned all works that supported Copernicus’ theories and forbade Galileo from publicly discussing his works.
  • In 1632, after the election of a new pope who he considered more liberal, he published another book, Dialogue on the Two Chief World Systems, Ptolemaic and Copernican, which argued both sides of the scientific (and religious) debate but fell squarely on the side of Copernicus’ heliocentrism.
  • Galileo was once again summoned to Rome. In 1633, following a trial, he was found guilty of suspected heresy, forced to recant his views and sentenced to house arrest until his death in 1642.
  • It took nearly 200 years after Galileo’s death for the Catholic Church to drop its opposition to heliocentrism.
  • In 1992, after a decade-long process and 359 years after his heresy conviction, Pope John Paul II formally expressed the Church’s regret over Galileo’s treatment.
anonymous

Can you trust your earliest childhood memories? - BBC Future - 1 views

  • The moments we remember from the first years of our lives are often our most treasured because we have carried them longest. The chances are, they are also completely made up.
  • Around four out of every 10 of us have fabricated our first memory, according to researchers. This is thought to be because our brains do not develop the ability to store autobiographical memories at least until we reach two years old.
  • Yet a surprising number of us have some flicker of memory from before that age
  • Experts have managed to turn people off all sorts of foods by convincing them it had made them ill when they were a child
  • “People have a life story, particularly as they get older and for some people it needs to stretch back to the very early stage of life,”
  • The prevailing account of how we come to believe and remember things is based around the concept of source monitoring. “Every time a thought comes to mind we have to make a decision – have we experienced it [an event], imagined it or have we talked about it with other people,” says Kimberley Wade
  • Most of the time we make that decision correctly and can identify where these mental experiences come from, but sometimes we get it wrong.
  • Wade admits she has spent a lot of time recalling an event that was actually something her brother experienced rather than herself, but despite this, it is rich in detail and provokes emotion
  • Memory researchers have shown it is possible to induce fictional autobiographical memories in volunteers, including accounts of getting lost in a shopping mall and even having tea with a member of the Royal Family
  • Based on my research, everybody is capable of forming complex false memories, given the right circumstances – Julia Shaw
  • In some situations, such as after looking at pictures or a video, children are more susceptible to forming false memories than adults. People with certain personality types are also thought to be more prone.
  • But carrying around false memories from your childhood could be having a far greater impact on you than you may realise too. The events, emotions and experiences we remember from our early years can help to shape who we are as adults, determining our likes, dislikes, fears and even our behaviour.
  • Memories before the age of three are more than likely to be false. Any that appear very fluid and detailed, as if you were playing back a home video and experiencing a chronological account of a memory, could well also be made up. It is more likely that fuzzy fragments, or snapshots of moments are real, as long as they are not from too early in your life.
  • We crave a cohesive narrative of our own existence and will even invent stories to give us a more complete picture
  • Interestingly, scientists have also found positive suggestions, such as “you loved asparagus the first time you ate it” tend to be more effective than negative suggestions like “you got sick drinking vodka”
  • “Miscarriage of justice, incarceration, loss of reputation, job and status, and family breakdown occur,
  • One of the major problems with legal cases involving false memories, is that it is currently impossible to distinguish between true and fictional recollections
  • Efforts have been made to analyse minor false memories in a brain scanner (fMRI) and detect different neurological patterns, but there is nothing as yet to indicate that this technology can be used to detect whether recollections have become distorted.
  • the most extreme case of memory implantation involves a controversial technique called “regression therapy”, where patients confront childhood traumas, supposedly buried in their subconscious
  • “Memories are malleable and tend to change slightly each time we revisit them, in the same way that spoken stories do,”
  • “Therefore at each recollection, new elements can easily be integrated while existing elements can be altered or lost.”
  • This is not to say that all evidence that relies on memory should be discarded or regarded as unreliable – they often provide the most compelling testimony in criminal cases. But it has led to rules and guidelines about how witnesses and victims should be questioned to ensure their recollections of an event or perpetrator are not contaminated by investigators or prosecutors.
  • Any memories that appear very fluid and detailed, as if you were playing back a home video, could well also be made up
  • While this may seem like a bit of fun, many scientists believe the “false memory diet” could be used to tackle obesity and encourage people to reach for healthier options like asparagus, or even help cut people’s alcohol consumption.
  • Children are more susceptible to forming false memories than adults, especially after looking at photographs or films
  • And we may not want to rid ourselves of these memories. Our memories, whether fictional or not, can help to bring us closer together.
  •  This is a great and very detailed article about memory and how we change our own memories and are impacted by this change.
manhefnawi

Just one night of poor sleep can boost Alzheimer's proteins | Science News - 0 views

  • Healthy adults built up Alzheimer’s-associated proteins in their cerebral spinal fluid when prevented from getting slow-wave sleep, the deepest stage of sleep, researchers report July 10 in Brain. Just one night of deep-sleep disruption was enough to increase the amount of amyloid-beta, a protein that clumps into brain cell‒killing plaques in people with Alzheimer’s. People in the study who slept poorly for a week also had more of a protein called tau in their spinal fluid than they did when well rested. Tau snarls itself into tangles inside brain cells of people with the disease.
Javier E

Big Data Is Great, but Don't Forget Intuition - NYTimes.com - 2 views

  • THE problem is that a math model, like a metaphor, is a simplification. This type of modeling came out of the sciences, where the behavior of particles in a fluid, for example, is predictable according to the laws of physics.
  • In so many Big Data applications, a math model attaches a crisp number to human behavior, interests and preferences. The peril of that approach, as in finance, was the subject of a recent book by Emanuel Derman, a former quant at Goldman Sachs and now a professor at Columbia University. Its title is “Models. Behaving. Badly.”
  • A report last year by the McKinsey Global Institute, the research arm of the consulting firm, projected that the United States needed 140,000 to 190,000 more workers with “deep analytical” expertise and 1.5 million more data-literate managers, whether retrained or hired.
  • A major part of managing Big Data projects, he says, is asking the right questions: How do you define the problem? What data do you need? Where does it come from? What are the assumptions behind the model that the data is fed into? How is the model different from reality?
  • Society might be well served if the model makers pondered the ethical dimensions of their work as well as studying the math, according to Rachel Schutt, a senior statistician at Google Research. “Models do not just predict, but they can make things happen,” says Ms. Schutt, who taught a data science course this year at Columbia. “That’s not discussed generally in our field.”
  • the increasing use of software that microscopically tracks and monitors online behavior has raised privacy worries. Will Big Data usher in a digital surveillance state, mainly serving corporate interests?
  • my bigger concern is that the algorithms that are shaping my digital world are too simple-minded, rather than too smart. That was a theme of a book by Eli Pariser, titled “The Filter Bubble: What the Internet Is Hiding From You.”
Javier E

Deluded Individualism - NYTimes.com - 2 views

  • We tend to see ourselves as self-determining, self-conscious agents in all that we decide and do, and we cling to that image. But why? Why do we resist the truth? Why do we wish — strain, strive, against the grain of reality — to be autonomous individuals, and see ourselves as such?
  • why do we presume individual agency in the first place? Why do we insist on it stubbornly, irrationally, often recklessly?
  • though Republicans call for deep cuts to the safety net, their districts rely more on government support than their Democratic counterparts.
  • The Times’s reporters spoke with residents who supported the Tea Party and its proposed cuts to federal spending, even while they admitted they could not get by without government support.
  • the fate of the middle class counties and urban ghettos is entwined. When the poor are left to rot in their misery, the misery does not stay contained. It harms us all. The crime radiates, the misery offends, it debases the whole. Individuals, much less communities, cannot be insulated from it.
  • Thanks to a decades-long safety net, we have forgotten the trials of living without it. This is why, the historian Tony Judt argued, it’s easy for some to speak fondly of a world without government: we can’t fully imagine or recall what it’s like. We can’t really appreciate the horrors Upton Sinclair witnessed in the Chicago slaughterhouses before regulation, or the burden of living without Social Security and Medicare to look forward to. Thus, we can entertain nostalgia for a time when everyone pulled his own weight, bore his own risk, and was the master of his destiny. That time was a myth
  • To be human, according to Spinoza, is to be party to a confounding existential illusion — that human individuals are independent agents — which exacts a heavy emotional and political toll on us. It is the source of anxiety, envy, anger — all the passions that torment our psyche — and the violence that ensues.
  • There is no such thing as a discrete individual, Spinoza points out. This is a fiction. The boundaries of ‘me’ are fluid and blurred. We are all profoundly linked in countless ways we can hardly perceive. My decisions, choices, actions are inspired and motivated by others to no small extent.
  • we’re all in this together. We are not the sole authors of our destiny, each of us; our destinies are entangled — messily, unpredictably. Our cultural demands of individualism are too extreme. They are constitutionally irrational, Spinoza and Freud tell us, and their potential consequences are disastrous. Thanks to our safety net, we live in a society that affirms the dependence and interdependence of all. To that extent, it affirms a basic truth of our nature. We forsake it at our own peril.
Javier E

Coursekit Raises $5 Million to Reinvent the Classroom - NYTimes.com - 0 views

  • Coursekit is a new tool that lets teachers and educators create mini social networks around individual courses and lectures.
  • The goal of the service, said Joseph Cohen, its co-founder and chief executive, is to take some of the most successful elements of social networking — especially the fluid exchange of ideas that comes natural to online interactions — to revitalize the education experience. Students are already accustomed to interacting online and supplementing their daily lives with the Web and social media. Why should that stop when it comes to learning? “Our education experience is truly offline,” he said. “We want to build what Facebook has done for your personal life, but for your school.”
  • Using Coursekit’s software, teachers can upload homework assignments, answer questions, grade work and facilitate discussions with their students. In addition, students can use the software to chat with one another, collaborate on projects and share relevant materials with their classmates
  • Coursekit is free to both the instructors and students that want to have access to it. The company says its main focus is attracting users, not making money
Javier E

Study Shows Why Lawyers Are So Smart - WSJ.com - 2 views

  • The research team performed brain scans on 24 college students and recent graduates, both before and after they spent 100 hours studying for the LSAT over a three-month period. The researchers also scanned 23 young adults who didn't study for the test. For those who studied, the results showed increased connectivity between the frontal lobes of the brain, as well as between the frontal and parietal lobes, which are parts of the brain associated with reasoning and thinking.
  • The study focused on fluid reasoning—the ability to tackle a novel problem—which is a central part of IQ tests and can to some degree predict academic performance or ability in demanding careers.
  • "People assume that IQ tests measure some stable characteristic of an individual, but we think this whole assumption is flawed," said Silvia Bunge, the study's senior author. "We think the skills measured by an IQ test wax and wane over time depending on the individual's level of cognitive activity."
Sophia C

BBC News - Viewpoint: Human evolution, from tree to braid - 0 views

  • What was, in my view, a logical conclusion reached by the authors was too much for some researchers to take.
  • The conclusion of the Dmanisi study was that the variation in skull shape and morphology observed in this small sample, derived from a single population of Homo erectus, matched the entire variation observed among African fossils ascribed to three species - H. erectus, H. habilis and H. rudolfensis.
  • a single population of H. erectus,
  • They all had to be the same species.
  • was not surprising to find that Neanderthals and modern humans interbred, a clear expectation of the biological species concept.
  • I wonder when the penny will drop: when we have five pieces of a 5,000-piece jigsaw puzzle, every new bit that we add is likely to change the picture.
  • The identity of the fourth player remains unknown but it was an ancient lineage that had been separate for probably over a million years. H. erectus seems a likely candidate. Whatever the name we choose to give this mystery lineage, what these results show is that gene flow was possible not just among contemporaries but also between ancient and more modern lineages.
  • Scientists succeeded in extracting the most ancient mitochondrial DNA so far, from the Sima de los Huesos site in Atapuerca, Spain.
  • We have built a picture of our evolution based on the morphology of fossils and it was wrong.
    • Sophia C: Kuhn
  • when we know how plastic - or easily changeable - skull shape is in humans. And our paradigms must also change.
  • We must abandon, once and for all, views of modern human superiority over archaic (ancient) humans. The terms "archaic" and "modern" lose all meaning as do concepts of modern human replacement of all other lineages.
  • The deep-rooted shackles that have sought to link human evolution with stone tool-making technological stages - the Stone Ages - even when we have known that these have overlapped with each other for half-a-million years in some instances.
  • The world of our biological and cultural evolution was far too fluid for us to constrain it into a few stages linked by transitions.
  • We have to flesh out the genetic information and this is where archaeology comes into the picture.
  • Rather than focus on differences between modern humans and Neanderthals, what the examples show is the range of possibilities open to humans (Neanderthals included) in different circumstances.
  • research using new technology on old archaeological sites, as at La Chapelle; and
Javier E

Does Google Make Us Stupid? - Pew Research Center - 0 views

  • Carr argued that the ease of online searching and distractions of browsing through the web were possibly limiting his capacity to concentrate. "I'm not thinking the way I used to," he wrote, in part because he is becoming a skimming, browsing reader, rather than a deep and engaged reader. "The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas.... If we lose those quiet spaces, or fill them up with ‘content,' we will sacrifice something important not only in our selves but in our culture."
  • force us to get smarter if we are to survive. "Most people don't realize that this process is already under way," he wrote. "In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity." He argued that while the proliferation of technology and media can challenge humans' capacity to concentrate there were signs that we are developing "fluid intelligence-the ability to find meaning in confusion and solve new problems, independent of acquired knowledge." He also expressed hope that techies will develop tools to help people find and assess information smartly.
  • 76% of the experts agreed with the statement, "By 2020, people's use of the internet has enhanced human intelligence; as people are allowed unprecedented access to more information they become smarter and make better choices. Nicholas Carr was wrong: Google does not make us stupid."
Javier E

Older Really Can Mean Wiser - NYTimes.com - 0 views

  • mental faculties that improve with age.
  • Knowledge is a large part of the equation, of course. People who are middle-aged and older tend to know more than young adults, by virtue of having been around longer, and score higher on vocabulary tests, crossword puzzles and other measures of so-called crystallized intelligence.
  • the older brain offers something more, according to a new paper in the journal Psychological Science. Elements of social judgment and short-term memory, important pieces of the cognitive puzzle, may peak later in life than previously thought.
  • ...15 more annotations...
  • The researchers found that the broad split in age-related cognition — fluid in the young, crystallized in the old — masked several important nuances.
  • A year ago, German scientists argued that cognitive “deficits” in aging were caused largely by the accumulation of knowledge — that is, the brain slows down because it has to search a larger mental library of facts
  • Experts said the new analysis raised a different question: Are there distinct, independent elements of memory and cognition that peak at varying times of life?
  • The strength of the new analysis is partly in its data. The study evaluated historic scores from the popular Wechsler intelligence test, and compared them with more recent results from tens of thousands of people who took short cognitive tests on the authors’ websites, testmybrain.org and gameswithwords.org
  • The one drawback of this approach is that, because it didn’t follow the same people over a lifetime, it might have missed the effect of different cultural experiences
  • most previous studies have not been nearly as large, or had such a range of ages. Participants on the websites were 10 to 89 years old, and they took a large battery of tests, measuring skills like memory for abstract symbols and strings of digits, problem solving, and facility reading emotions from strangers’ eyes.
  • “We found different abilities really maturing or ripening at different ages,” Dr. Germine said. “It’s a much richer picture of the life span than just calling it aging.”
  • At least as important, the researchers looked at the effect of age on each type of test.
  • Processing speed — the quickness with which someone can manipulate digits, words or images, as if on a mental sketch board — generally peaks in the late teens
  • memory for some things, like names, does so in the early 20s
  • But the capacity of that sketch board, called working memory, peaks at least a decade later and is slow to decline. In particular, the ability to recall faces and do some mental manipulation of numbers peaked about age 30,
  • The researchers also analyzed results from the Reading the Mind in the Eyes test. The test involves looking at snapshots of strangers’ eyes on a computer screen and determining their moods from a menu of options like “tentative,” “uncertain” and “skeptical.”
  • people in their 40s or 50s consistently did the best, the study found, and the skill declined very slowly later in life
  • The picture that emerges from these findings is of an older brain that moves more slowly than its younger self, but is just as accurate in many areas and more adept at reading others’ moods — on top of being more knowledgeable. That’s a handy combination, given that so many important decisions people make intimately affects others.
  • for now, the new research at least gives some meaning to the empty adjective “wily.”
Javier E

The Fall of Facebook - The Atlantic - 0 views

  • Alexis C. Madrigal, Nov 17, 2014, 7:59 PM ET: Social networking is not, it turns out, winner take all. In the past, one might have imagined that switching between Facebook and “some other network” would be difficult, but the smartphone interface makes it easy to be on a dozen networks. All messages come to the same place—the phone’s notifications screen—so what matters is what your friends are doing, not which apps they’re using.
  • if I were to put money on an area in which Facebook might be unable to dominate in the future, it would be apps that take advantage of physical proximity. Something radically new could arise on that front, whether it’s an evolution of Yik Yak
  • The Social Machine, predicts that text will be a less and less important part of our asynchronous communications mix. Instead, she foresees a “very fluid interface” that would mix text with voice, video, sensor outputs (location, say, or vital signs), and who knows what else
  • ...5 more annotations...
  • the forthcoming Apple Watch seems like a step toward the future Donath envisions. Users will be able to send animated smiley faces, drawings, voice snippets, and even their live heartbeats, which will be tapped out on the receiver’s wrist.
  • A simple but rich messaging platform—perhaps with specialized hardware—could replace the omnibus social network for most purposes. “I think we’re shifting in a weird way to one-on-one conversations on social networks and in messaging apps,” says Shani Hilton, the executive editor for news at BuzzFeed, the viral-media site. “People don’t want to perform their lives publicly in the same way that they wanted to five years ago.”
  • Facebook is built around a trade-off that it has asked users to make: Give us all your personal information, post all your pictures, tag all your friends, and so on, forever. In return, we’ll optimize your social life. But this output is only as good as the input. And it turns out that, when scaled up, creating this input—making yourself legible enough to the Facebook machine that your posts are deemed “relevant” and worthy of being displayed to your mom and your friends—is exhausting labor.
  • These new apps, then, are arguments that we can still have an Internet that is weird, and private. That we can still have social networks without the social network. And that we can still have friends on the Internet without “friending” them.
  • A Brief History of Information Gatekeepers
    1871: Western Union controls 90 percent of U.S. telegraph traffic.
    1947: 97 percent of the country’s radio stations are affiliated with one of four national networks.
    1969: Viewership for the three nightly network newscasts hits an all-time high, with 50 percent of all American homes tuning in.
    1997: About half of all American homes with Internet access get it through America Online.
    2002: Microsoft Internet Explorer captures 97 percent of the worldwide browser market.
    2014: Amazon sells 63 percent of all books bought online—and 40 percent of books overall.
Javier E

Opinion | Is There Such a Thing as an Authoritarian Voter? - The New York Times - 0 views

  • Jonathan Weiler, a political scientist at the University of North Carolina at Chapel Hill, has spent much of his career studying the appeal of authoritarian figures: politicians who preach xenophobia, beat up on the press and place themselves above the law while extolling “law and order” for everyone else.
  • He is one of many scholars who believe that deep-seated psychological traits help explain voters’ attraction to such leaders. “These days,” he told me, “audiences are more receptive to the idea” than they used to be.
  • “In 2018, the sense of fear and panic — the disorientation about how people who are not like us could see the world the way they do — it’s so elemental,” Mr. Weiler said. “People understand how deeply divided we are, and they are looking for explanations that match the depth of that division.”
  • ...24 more annotations...
  • Moreover, using the child-rearing questionnaire, African-Americans score as far more authoritarian than whites
  • what, exactly, is an “authoritarian” personality? How do you measure it?
  • for more than half a century — social scientists have tried to figure out why some seemingly mild-mannered people gravitate toward a strongman
  • the philosopher (and German refugee) Theodor Adorno collaborated with social scientists at the University of California at Berkeley to investigate why ordinary people supported fascist, anti-Semitic ideology during the war. They used a questionnaire called the F-scale (F is for fascism) and follow-up interviews to analyze the “total personality” of the “potentially antidemocratic individual.”
  • The resulting 1,000-page tome, “The Authoritarian Personality,” published in 1950, found that subjects who scored high on the F-scale disdained the weak and marginalized. They fixated on sexual deviance, embraced conspiracy theories and aligned themselves with domineering leaders “to serve powerful interests and so participate in their power,”
  • “Globalized free trade has shafted American workers and left us looking for a strong male leader, a ‘real man,’” he wrote. “Trump offers exactly what my maladapted unconscious most craves.”
  • one of the F-scale’s prompts: “Obedience and respect for authority are the most important virtues children should learn.” Today’s researchers often diagnose latent authoritarians through a set of questions about preferred traits in children: Would you rather your child be independent or have respect for elders? Have curiosity or good manners? Be self-reliant or obedient? Be well behaved or considerate?
  • a glance at the Christian group Focus on the Family’s “biblical principles for spanking” reminds us that your approach to child rearing is not pre-political; it is shorthand for your stance in the culture wars.
  • “All the social sciences are brought to bear to try to explain all the evil that persists in the world, even though the liberal Enlightenment worldview says that we should be able to perfect things,” said Mr. Strouse, the Trump voter
  • what should have been obvious:
  • “Trump’s electoral strength — and his staying power — have been buoyed, above all, by Americans with authoritarian inclinations,” wrote Matthew MacWilliams, a political consultant who surveyed voters during the 2016 election
  • The child-trait test, then, is a tool to identify white people who are anxious about their decline in status and power.
  • new book, “Prius or Pickup?,” by ditching the charged term “authoritarian.” Instead, they divide people into three temperamental camps: fixed (people who are wary of change and “set in their ways”), fluid (those who are more open to new experiences and people) and mixed (those who are ambivalent).
  • “The term ‘authoritarian’ connotes a fringe perspective, and the perspective we’re describing is far from fringe,” Mr. Weiler said. “It’s central to American public opinion, especially on cultural issues like immigration and race.”
  • Other scholars apply a typology based on the “Big Five” personality traits identified by psychologists in the mid-20th century: extroversion, agreeableness, conscientiousness, neuroticism and openness to experience. (It seems that liberals are open but possibly neurotic, while conservatives are more conscientious.)
  • Historical context matters — it shapes who we are and how we debate politics. “Reason moves slowly,” William English, a political economist at Georgetown, told me. “It’s constituted sociologically, by deep community attachments, things that change over generations.”
  • “it is a deep-seated aspiration of many social scientists — sometimes conscious and sometimes unconscious — to get past wishy-washy culture and belief. Discourses that can’t be scientifically reduced are problematic” for researchers who want to provide “a universal account of behavior.”
  • in our current environment, where polarization is so unyielding, the apparent clarity of psychological and biological explanations becomes seductive
  • Attitudes toward parenting vary across cultures, and for centuries African-Americans have seen the consequences of a social and political hierarchy arrayed against them, so they can hardly be expected to favor it — no matter what they think about child rearing
  • — we know that’s not going to happen. People have wicked tendencies.”
  • as the social scientific portrait of humanity grows more psychological and irrational, it comes closer and closer to approximating the old Adam of traditional Christianity: a fallen, depraved creature, unable to see himself clearly except with the aid of a higher power
  • The conclusions of political scientists should inspire humility rather than hubris. In the end, they have confirmed what so many observers of our species have long suspected: None of us are particularly free or rational creatures.
  • Allen Strouse is not the archetypal Trump voter whom journalists discover in Rust Belt diners. He is a queer Catholic poet and scholar of medieval literature who teaches at the New School in New York City. He voted for Mr. Trump “as a protest against the Democrats’ failures on economic issues,” but the psychological dimensions of his vote intrigue him. “Having studied Freudian analysis, and being in therapy for 10 years, I couldn’t not reflexively ask myself, ‘How does this decision have to do with my psychology?’” he told me.
  • their preoccupation with childhood and "primitive and irrational wishes and fears" has influenced the study of authoritarianism ever since.