
Home/ TOK Friends/ Group items tagged aging


tonycheng6

Anti-ageing effects of protein restriction unpacked - 0 views

  • The idea that dietary restriction can be used as a tool to increase lifespan has been a centrepiece of ageing research for decades. But the mechanisms by which dietary restriction might act, and the specific nutritional components involved, remain unclear.
  • Both dietary protein restriction (which results in low levels of leucine and other BCAAs) and inhibition of mTOR can extend lifespan in animals
  • Flies that carry a mutation in this residue have lower mTOR activity than do controls. They are also longer-lived, and are protected against the lifespan-shortening effects of a high-protein diet.
  • In contrast to a previous study6, they observed a robust lifespan extension in male mice fed a BCAA-restricted diet throughout life, equal to the benefits of dietary protein restriction.
  • Interestingly, female mice showed no lifespan extension from BCAA or dietary protein restriction, and if BCAA restriction was started during middle age, the benefits on males were greatly reduced. Thus, both studies collectively point to mTOR as a primary mediator of the benefits associated with BCAA restriction (Fig. 1).
  • A clear picture is emerging of how specific amino acids are sensed by sestrin to regulate mTOR signalling and autophagy and so preserve the function of intestinal stem cells during ageing.
  • Genetic background is crucial in the response to dietary restriction, with an identical low-calorie regimen increasing lifespan in some mouse strains but shortening it in others
  • There is also evidence that dietary restriction initiated later in life might have reduced benefits in rodents and, in some cases, result in premature death
  • Taken together, these observations suggest that although protein- and BCAA-restricted diets are a powerful research tool for exploring the fundamental mechanisms of ageing, it is premature to recommend adoption by the general population.
lucieperloff

New Study Suggests COVID-19 May Age Some Patients' Brains By 10 Years | HuffPost Life - 0 views

  • People recovering from COVID-19 may suffer significant brain function impacts, with the worst cases of the infection linked to mental decline equivalent to the brain aging by 10 years
    • lucieperloff
       
      Permanent deficits? Or do these go away?
  • “Our analyses ... align with the view that there are chronic cognitive consequences of having COVID-19,” the researchers wrote in a report of their findings.
    • lucieperloff
       
      This is more than just a cold. There will likely also be more discovered about the disease as more people recover
  • The cognitive deficits were “of substantial effect size,” particularly among people who had been hospitalized with COVID-19, the researchers said, with the worst cases showing impacts “equivalent to the average 10-year decline in global performance between the ages of 20 to 70.”
  • the study’s findings may not be entirely reliable, since the researchers did not compare before-and-after scores, and the study involved a large number of people who self-reported having had COVID-19 without a positive test.
Javier E

Scientists Can No Longer Ignore Ancient Flooding Tales - The Atlantic - 0 views

  • It wasn’t long after Henry David Inglis arrived on the island of Jersey, just northwest of France, that he heard the old story. Locals eagerly told the 19th-century Scottish travel writer how, in a bygone age, their island had been much more substantial, and that folks used to walk to the French coast. The only hurdle to their journey was a river—one easily crossed using a short bridge.
  • there had been a flood. A big one. Between roughly 15,000 and 6,000 years ago, massive flooding caused by melting glaciers raised sea levels around Europe. That flooding is what eventually turned Jersey into an island.
  • Rather than being a ridiculous claim not worthy of examination, perhaps the old story was true—a whisper from ancestors who really did walk through now-vanished lands
  • That’s exactly what the geographer Patrick Nunn and the historian Margaret Cook at the University of the Sunshine Coast in Australia have proposed in a recent paper.
  • In their work, the pair describe colorful legends from northern Europe and Australia that depict rising waters, peninsulas becoming islands, and receding coastlines during that period of deglaciation thousands of years ago. Some of these stories, the researchers say, capture historical sea-level rise that actually happened—often several thousand years ago. For scholars of oral history, that makes them geomyths.
  • “The first time I read an Aboriginal story from Australia that seemed to recall the rise of sea levels after the last ice age, I thought, No, I don’t think this is correct,” Nunn says. “But then I read another story that recalled the same thing.
  • For Jo Brendryen, a paleoclimatologist at the University of Bergen in Norway who has studied the effects of deglaciation in Europe following the end of the last ice age, the idea that traditional oral histories preserve real accounts of sea-level rise is perfectly plausible.
  • During the last ice age, he says, the sudden melting of ice sheets induced catastrophic events known as meltwater pulses, which caused sudden and extreme sea-level rise. Along some coastlines in Europe, the ocean may have risen as much as 10 meters in just 200 years. At such a pace, it would have been noticeable to people across just a few human generations.
  • “These are stories based in trauma, based in catastrophe.”
  • That, he suggests, is why it may have made sense for successive generations to pass on tales of geological upheaval. Ancient societies may have sought to broadcast their warning: Beware, these things can happen!
  • the old stories still have things to teach us. As Nunn says, “The fact that our ancestors have survived those periods gives us hope that we can survive this.”
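The meltwater-pulse figures quoted above are easy to sanity-check by hand. A minimal sketch, using the article's numbers (up to 10 meters of rise in roughly 200 years) and assuming a 25-year generation length, which the article does not specify:

```python
# Back-of-the-envelope check on the meltwater-pulse figures quoted above.
# RISE_M and YEARS come from the article; GENERATION_YEARS is an assumed
# human generation length for illustration only.

RISE_M = 10.0          # meters of sea-level rise along some European coasts
YEARS = 200.0          # over roughly this many years
GENERATION_YEARS = 25  # assumed generation length (not from the article)

rate_per_year_cm = RISE_M * 100 / YEARS                 # cm of rise per year
rise_per_generation_m = RISE_M / YEARS * GENERATION_YEARS

print(f"{rate_per_year_cm:.1f} cm/year")                # 5.0 cm/year
print(f"{rise_per_generation_m:.2f} m per generation")  # 1.25 m per generation
```

A rate of over a meter of sea-level rise per generation would indeed be noticeable within living memory, which is consistent with Brendryen's point that such change could plausibly enter oral tradition.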
Javier E

You Won't Stay the Same, Study Finds - NYTimes.com - 0 views

  • When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years.
  • when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.
  • They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement
  • At every age we think we’re having the last laugh, and at every age we’re wrong.”
  • the discrepancy did not seem to stem from faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.
  • a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness. “Believing that we just reached the peak of our personal evolution makes us feel good,”
  • Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,”
  • “The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.
lenaurick

IQ can predict your risk of death, and 8 other smart facts about intelligence - Vox - 0 views

  • But according to Stuart Ritchie, an intelligence researcher at the University of Edinburgh, there's a massive amount of data showing that it's one of the best predictors of someone's longevity, health, and prosperity
  • In a new book, Intelligence: All that Matters, Ritchie persuasively argues that IQ doesn't necessarily set the limit for what we can do, but it does give us a starting point
  • Most people you meet are probably average, and a few are extraordinarily smart. Just 2.2 percent have an IQ of 130 or greater
  • "The classic finding — I would say it is the most replicated finding in psychology — is that people who are good at one type of mental task tend to be good at them all,"
  • G-factor is real in the sense that it can predict outcomes in our lives — how much money you'll make, how productive of a worker you might be, and, most chillingly, how likely you are to die an earlier death.
  • According to the research, people with high IQs tend to be healthier and live longer than the rest of us
  • One is the fact that people with higher IQs tend to make more money than people with lower scores. Money is helpful for maintaining a healthy weight and good nutrition, and for accessing good health care.
  • IQ often beats personality when it comes to predicting life outcomes: Personality traits, a recent study found, can explain about 4 percent of the variance in test scores for students under age 16. IQ can explain 25 percent, or an even higher proportion, depending on the study.
  • Many of these correlations are less than .5, which means there's plenty of room for individual differences. So, yes, very smart people who are awful at their jobs exist. You're just less likely to come across them.
  • “The correlation between IQ and happiness is usually positive, but also usually smaller than one might expect (and sometimes not statistically significant),” Ritchie says.
  • It could also be that people with higher IQs are smart enough to avoid accidents and mishaps. There's actually some evidence to support this: Higher-IQ people are less likely to die in traffic accidents.
  • Even though intelligence generally declines with age, those who had high IQs as children were most likely to retain their smarts as very old people.
  • "If we know the genes related to intelligence — and we know these genes are related to cognitive decline as well — then we can start to predict who is going to have the worst cognitive decline, and devote health care resources to them," he says.
  • Studies comparing identical and fraternal twins find about half of IQ can be explained by genetics.
  • genetics seems to become more predictive of IQ with age.
  • The idea is that as we age, we grow more in control of our environments. Those environments we create can then "amplify" the potential of our genes.
  • About half the variability in IQ is attributed to the environment. Access to nutrition, education, and health care appear to play a big role.
  • “People’s lives are really messy, and the environments they are in are messy. There’s a possibility that a lot of the environmental effect on a person’s intelligence is random.”
  • Hurray! Mean IQ scores appear to be rising by 2 to 3 points per decade.
  • This phenomenon is known as the Flynn effect, and it is likely the result of increasing quality of childhood nutrition, health care, and education.
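The statistics behind the figures quoted above can be sketched briefly: a correlation of r explains r² of the variance (so r = 0.5 corresponds to the 25 percent figure for test scores), and twin-study heritability is often estimated with Falconer's formula. The twin correlations below are illustrative values chosen so the result lands near the article's "about half" — they are not numbers from the studies cited:

```python
# Sketch of the statistics behind the figures quoted above. The twin
# correlations passed to falconer_heritability are illustrative, not
# taken from the article.

def variance_explained(r: float) -> float:
    """Share of variance one variable explains in another, given correlation r."""
    return r ** 2

# A correlation of 0.5 corresponds to the ~25% of variance the article
# says IQ can explain in student test scores.
print(variance_explained(0.5))  # 0.25

def falconer_heritability(r_identical: float, r_fraternal: float) -> float:
    """Falconer's estimate: h^2 = 2 * (r_identical - r_fraternal)."""
    return 2 * (r_identical - r_fraternal)

# Illustrative twin correlations giving heritability near "about half".
print(falconer_heritability(0.75, 0.50))  # 0.5

# Flynn effect: a rise of 2 to 3 points per decade compounds to roughly
# 20-30 points over a century.
print(2.5 * 10)  # 25.0
```

Note also that the "correlations less than .5" point in the excerpts follows directly from this: r = 0.5 leaves 75 percent of the variance unexplained, which is the "plenty of room for individual differences."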
lenaurick

Mediterranean diet may slow aging of the brain - CNN.com - 0 views

  • As we age, our brains naturally shrink and our risk of having a stroke, dementia or Alzheimer's rises, and almost everyone experiences some kind of memory loss
  • Scientists know that people who exercise regularly, eat a healthy diet, avoid smoking and keep mentally stimulated generally have healthier brains
  • Researchers figured this out by looking at the brains of 674 people with an average age of 80. They asked these elderly people to fill out surveys about what they had eaten in the last year, and then scanned their brains. The group that ate a Mediterranean diet had heavier brains with more gray and white matter.
  • In this study, a higher consumption of fish seemed to make a big difference in keeping your brain young.
  • People who ate a diet close to the MIND diet saw a 53% lower risk of developing Alzheimer's.
  • Even people who ate the MIND diet "most" (as opposed to "all") of the time saw a 35% reduced chance of developing the disease.
  • It has also been shown to be key to helping you live longer. It helps you manage your weight better and can lower your risk for cancer and cardiovascular disease.
kortanekev

Unbelievable: Why Americans Mistrust Science | SciBytes | Learn Science at Scitable - 0 views

  • 25% of American respondents answered that the sun orbits the Earth [1]. A recent AP-GFK poll found that as many as 4 in 10 American adults doubt evolution, over half aren't confident that the Big Bang took place, just under 40% don't believe that pollution is causing climate change
  • When teaching science, teachers have to contend with students' old ideas of how the world works, an uphill battle where the old ideas have the advantage.
  • Children are exposed to scientific ideas at around age eight, when they become able to understand abstract concepts. Before that age, children rely on "magical thinking" to explain how the world exists and works, so, science education faces a tough challenge right from the start
  •  
    Interesting - before the age of 8 most children are unable to understand very abstract concepts and resort to forming ideas that make sense symbolically to them - "magical thinking." So we do not have fundamentally "scientific" minds, and instead must rewrite our preconceived notions.  (Evie - 12/6/16) 
sissij

How Social Isolation Is Killing Us - The New York Times - 0 views

  • About one-third of Americans older than 65 now live alone, and half of those over 85 do. People in poorer health — especially those with mood disorders like anxiety and depression — are more likely to feel lonely. Those without a college education are the least likely to have someone they can talk to about important personal matters.
  • Loneliness can accelerate cognitive decline in older adults, and isolated individuals are twice as likely to die prematurely as those with more robust social interactions. These effects start early: Socially isolated children have significantly poorer health 20 years later, even after controlling for other factors. All told, loneliness is as important a risk factor for early death as obesity and smoking.
  • The loneliness of older adults has different roots — often resulting from family members moving away and close friends passing away. As one senior put it, “Your world dies before you do.”
  • “In America, you almost need an excuse for knocking on a neighbor’s door,” Dr. Tang told me. “We want to break down those barriers.”
  • “You don’t need a playmate every day,” Dr. Tang said. “But knowing you’re valued and a contributing member of society is incredibly reaffirming.”
  • A great paradox of our hyper-connected digital age is that we seem to be drifting apart. Increasingly, however, research confirms our deepest intuition: Human connection lies at the heart of human well-being. It’s up to all of us — doctors, patients, neighborhoods and communities — to maintain bonds where they’re fading, and create ones where they haven’t existed.
  •  
    We are always finding reasons to do something good for others. However, these barriers are just invented; we don't need a reason to be kind and friendly. The digital age gives us access to more people, but it also limits the attention and effort we put into forming a friendship. Our brains are limited: attention blindness shows that we cannot handle too much information. What we should do is focus our attention on people in our community and actually make the effort to form relationships and connections. Usually, the people who are most real and close to us are not online. --Sissi (12/23/2016)
Javier E

The Age of Niallism: Ferguson and the Post-Fact World - Matthew O'Brien - The Atlantic - 0 views

  • Ferguson gets some facts wrong. Ferguson gets some facts right, but frames them incompletely. Why the outrage? Because he's treating facts as low-grade and cheap materials that are meant to be bent, spliced and morphed for the purpose of building a sensational polemic. Even more outrageous is that his bosses didn't mind enough to force him to make an honest argument, or even profess embarrassment when its dishonesty came to light.
  • it's not just Ferguson. There is an epidemic of Niallism -- which Seamus McKiernan of the Huffington Post defined as not believing in anything factual. It's the idea that bluster can make untruths true through mere repetition. 
  • We live in a post-truth age. That's the term David Roberts of Grist coined to describe the way lies get amplified in our media ecosystem.
abby deardorff

Musical Training Optimizes Brain Function | Psychology Today - 0 views

  • Three Brain Benefits of Musical Training:
  • musical training can have a huge impact on the developing brain
  • systematic training actually helped improve brain areas related to music improvisation.
  • training before the age of 7 years results in changes in white-matter connectivity that may serve as a solid scaffolding upon which ongoing experience can maintain a well-connected brain infrastructure into adulthood.
  • musical training improves the function and connectivity of different brain regions. Musical training increases brain volume and strengthens communication between brain areas. Playing an instrument changes how the brain interprets and integrates a wide range of sensory information, especially for those who start before age 7.
  • Musicians have an enhanced ability to integrate sensory information from hearing, touch, and sight. The age at which musical training begins affects brain anatomy as an adult; beginning training before the age of seven has the greatest impact. Brain circuits involved in musical improvisation are shaped by systematic training, leading to less reliance on working memory and more extensive connectivity within the brain.
qkirkpatrick

BBC News - Study investigates how brain deteriorates with age - 0 views

  • People in their 70s are helping scientists in Edinburgh investigate the effects of ageing on the brain.
  • Part of the study sees volunteers re-take tests that they first carried out when they were young children, to see how their mental abilities have changed.
  •  
    Study shows how the brain deteriorates with age
Javier E

Mortality and its Discontents - NYTimes.com - 0 views

  • the real tragedy is not every click of the postpartum clock, but how we have come to see aging as a disease.
  • Gawande makes the point that we’ve got to get over the idea that aging is a disease. “People live longer and better than any time in history,” Gawande writes. “But scientific advances have turned the process of aging and dying into medical experiences.” And he concludes, “Death of course is not failure. Death is normal.”
Javier E

Geology's Timekeepers Are Feuding - The Atlantic - 0 views

  • In 2000, the Nobel Prize-winning chemist Paul Crutzen won permanent fame for stratigraphy. He proposed that humans had so thoroughly altered the fundamental processes of the planet—through agriculture, climate change, nuclear testing, and other phenomena—that a new geological epoch had commenced: the Anthropocene, the age of humans.
  • Zalasiewicz should know. He is the chair of the Anthropocene working group, which the ICS established in 2009 to investigate whether the new epoch deserved a place in stratigraphic time.
  • In 2015, the group announced that the Anthropocene was a plausible new layer and that it should likely follow the Holocene. But the team has yet to propose a “golden spike” for the epoch: a boundary in the sedimentary rock record where the Anthropocene clearly begins.
  • Officially, the Holocene is still running today. You have lived your entire life in the Holocene, and the Holocene has constituted the geological “present” for as long as there have been geologists. But if we now live in a new epoch, the Anthropocene, then the ICS will have to chop the Holocene somewhere. It will have to choose when the Holocene ended, and it will move some amount of time out of the purview of the Holocene working group and into that of the Anthropocene working group.
  • This is politically difficult. And right now, the Anthropocene working group seems intent on not carving too deep into the Holocene. In a paper published earlier this year in Earth-Science Reviews, the Anthropocene working group’s members strongly imply that they will propose starting the new epoch in the mid-20th century.
  • Some geologists argue that the Anthropocene started even earlier: perhaps 4,000 or 6,000 years ago, as farmers began to remake the land surface. “Most of the world’s forests that were going to be converted to cropland and agriculture were already cleared well before 1950,” says Bill Ruddiman, a geology professor at the University of Virginia and an advocate of this extremely early Anthropocene.
  • “Most of the world’s prairies and steppes that were going to be cleared for crops were already gone by then. How can you argue the Anthropocene started in 1950 when all of the major things that affect Earth’s surface were already over?” Van der Pluijm agreed that the Anthropocene working group was picking 1950 for “not very good reasons.” “Agriculture was the revolution that allowed society to develop,” he said. “That was really when people started to force the land to work for them. That massive land movement—it’s like a landslide, except it’s a humanslide. And it is not, of course, as dramatic as today’s motion of land, but it starts the clock.”
  • This muddle had to stop. The Holocene comes up constantly in discussions of modern global warming. Geologists and climate scientists did not make their jobs any easier by slicing it in different ways and telling contradictory stories about it.
  • This process started almost 10 years ago. For this reason, Zalasiewicz, the chair of the Anthropocene working group, said he wasn’t blindsided by the new subdivisions at all. In fact, he voted to adopt them as a member of the Quaternary working group. “Whether the Anthropocene works with a unified Holocene or one that’s in three parts makes for very little difference,” he told me. In fact, it had made the Anthropocene group’s work easier. “It has been useful to compare the scale of the two climate events that mark the new boundaries [within the Holocene] with the kind of changes that we’re assessing in the Anthropocene. It has been quite useful to have the compare and contrast,” he said. “Our view is that some of the changes in the Anthropocene are rather bigger.”
  • Zalasiewicz said that he and his colleagues were going as fast as they could. When the working group began its work in 2009, it was “really starting from scratch,” he told me. While other working groups have a large body of stratigraphic research to consider, the Anthropocene working group had nothing. “We had to spend a fair bit of time deciding whether the Anthropocene was geology at all,” he said. Then they had to decide where its signal could show up. Now, they’re looking for evidence that shows it.
  • This cycle of “glacials” and “interglacials” has played out about 50 times over the last several million years. When the Holocene began, it was only another interglacial—albeit the one we live in. Until recently, glaciers were still on schedule to descend in another 30,000 years or so. Yet geologists still call the Holocene an epoch, even though they do not bestow this term on any of the previous 49 interglacials. It gets special treatment because we live in it.
  • Much of this science is now moot. Humanity’s vast emissions of greenhouse gas have now so warmed the climate that they have offset the next glaciation. They may even knock us out of the ongoing cycle of Ice Ages, sending the Earth hurtling back toward a “greenhouse” climate after the more amenable “icehouse” climate during which humans evolved. For this reason, van der Pluijm wants the Anthropocene to supplant the Holocene entirely. Humans made their first great change to the environment at the close of the last glaciation, when they seem to have hunted the world’s largest mammals—the wooly mammoth, the saber-toothed tiger—to extinction. Why not start the Anthropocene then? He would even rename the pre-1800 period “the Holocene Age” as a consolation prize:
  • Zalasiewicz said he would not start the Anthropocene too early in time, as it would be too work-intensive for the field to rename such a vast swath of time. “The early-Anthropocene idea would crosscut against the Holocene as it’s seen by Holocene workers,” he said. If other academics didn’t like this, they could create their own timescales and start the Anthropocene Epoch where they choose. “We have no jurisdiction over the word Anthropocene,” he said.
  • Ruddiman, the University of Virginia professor who first argued for a very early Anthropocene, now makes an even broader case. He’s not sure it makes sense to formally define the Anthropocene at all. In a paper published this week, he objects to designating the Anthropocene as starting in the 1950s—and then he objects to delineating the Anthropocene, or indeed any new geological epoch, by name. “Keep the use of the term informal,” he told me. “Don’t make it rigid. Keep it informal so people can say the early-agricultural Anthropocene, or the industrial-era Anthropocene.”
  • “This is the age of geochemical dating,” he said. Geologists have stopped looking to the ICS to place each rock sample into the rock sequence. Instead, field geologists use laboratory techniques to get a precise year or century of origin for each rock sample. “The community just doesn’t care about these definitions,” he said.
Javier E

A New Dark Age Looms - The New York Times - 1 views

  • picture yourself in our grandchildren’s time, a century hence. Significant global warming has occurred, as scientists predicted. Nature’s longstanding, repeatable patterns — relied on for millenniums by humanity to plan everything from infrastructure to agriculture — are no longer so reliable. Cycles that have been largely unwavering during modern human history are disrupted by substantial changes in temperature and precipitation.
  • As Earth’s warming stabilizes, new patterns begin to appear. At first, they are confusing and hard to identify. Scientists note similarities to Earth’s emergence from the last ice age. These new patterns need many years — sometimes decades or more — to reveal themselves fully, even when monitored with our sophisticated observing systems
  • Disruptive societal impacts will be widespread.
  • Our foundation of Earth knowledge, largely derived from historically observed patterns, has been central to society’s progress. Early cultures kept track of nature’s ebb and flow, passing improved knowledge about hunting and agriculture to each new generation. Science has accelerated this learning process through advanced observation methods and pattern discovery techniques. These allow us to anticipate the future with a consistency unimaginable to our ancestors.
  • But as Earth warms, our historical understanding will turn obsolete faster than we can replace it with new knowledge. Some patterns will change significantly; others will be largely unaffected
  • The list of possible disruptions is long and alarming.
  • Historians of the next century will grasp the importance of this decline in our ability to predict the future. They may mark the coming decades of this century as the period during which humanity, despite rapid technological and scientific advances, achieved “peak knowledge” about the planet it occupies
  • One exception to this pattern-based knowledge is the weather, whose underlying physics governs how the atmosphere moves and adjusts. Because we understand the physics, we can replicate the atmosphere with computer models.
  • But farmers need to think a season or more ahead. So do infrastructure planners as they design new energy and water systems
  • The intermediate time period is our big challenge. Without substantial scientific breakthroughs, we will remain reliant on pattern-based methods for time periods between a month and a decade. The problem is, as the planet warms, these patterns will become increasingly difficult to discern.
  • The oceans, which play a major role in global weather patterns, will also see substantial changes as global temperatures rise. Ocean currents and circulation patterns evolve on time scales of decades and longer, and fisheries change in response. We lack reliable, physics-based models to tell us how this occurs.
  • Our grandchildren could grow up knowing less about the planet than we do today. This is not a legacy we want to leave them. Yet we are on the verge of ensuring this happens.
maxwellokolo

Sugary Drinks Tied to Accelerated Brain Aging - 0 views

  •  
    Compared with those who drank no sugary drinks, those who drank one or two a day had a reduced brain volume equivalent to 1.6 years of normal aging, and lower memory scores equivalent to 5.8 years of aging.
lucieperloff

Why Older People Managed to Stay Happier Through the Pandemic - The New York Times - 0 views

  • that age and emotional well-being tend to increase together, as a rule, even as mental acuity and physical health taper off.
    • lucieperloff
       
      this makes sense bc older people have had more experiences to learn how to cope with stress
  • Do people somehow develop better coping skills as they age?
  • Yet their moods remained elevated, on average, compared with those in younger generations, the survey data showed — despite the fact that both groups reported the same stress levels.
  • Older people, especially those with some resources, have more ability than younger adults to soften the edges of a day, by paying for delivery, hiring help, staying comfortably homebound and — crucially — doing so without young children underfoot.
  • when people are young, their goals and motives are focused on gaining skills and taking chances, to prepare for opportunities the future may hold.
  • They have come to accept themselves for who they are, rather than who they’re supposed to become.
knudsenlu

Why Asia Is Fast Becoming A Global Leader In Neuroscience - 0 views

  • The world is on the cusp of redefining brain aging. Asia Pacific, known for countries with rapidly aging populations, has unimaginable potential to lead the charge in research and innovation for better, younger, healthier brains.
  • The human brain contains approximately 80 billion neurons, has trillions of connections, is estimated to store up to 2.5 petabytes of memory, and for more than 3000 years we’ve asked questions about how it works.
  • One thing we have realized is that, as exceptional as our brain is, the biological process of aging also occurs in the brain.
  • Asian scientists, laboratories and companies are fast becoming a major driver for this innovation – perhaps not a surprise as we are already facing the challenges of an aging population such as growing economic, societal and personal costs.
  • It is an exciting time to be a neuroscientist and we are closer now than ever to developing breakthrough treatments and technologies that have the potential to make real societal impacts. Neuroscience has advanced more in the last 10 years than the previous 50, and with new perspectives starting to find a voice, I’m quite excited to see what the next 10 years will hold.
Javier E

Inequality and the Modern Culture of Celebrity - NYTimes.com - 0 views

  • The Depression that ended Fitzgerald’s Jazz Age yielded to a new order that might be called the Roosevelt Republic. In the quarter-century after World War II, the country established collective structures, not individual monuments, that channeled the aspirations of ordinary people: state universities, progressive taxation, interstate highways, collective bargaining, health insurance for the elderly, credible news organizations.
  • One virtue of those hated things called bureaucracies is that they oblige everyone to follow a common set of rules, regardless of station or background; they are inherently equalizing.
  • Our age is lousy with celebrities. They can be found in every sector of society, including ones that seem less than glamorous
  • ...2 more annotations...
  • This new kind of celebrity is the ultimate costume ball, far more exclusive and decadent than even the most potent magnates of Hollywood’s studio era could have dreamed up.
  • after decades of widening income gaps, unequal distributions of opportunity and reward, and corroding public institutions, we have gone back to Gatsby’s time — or something far more perverse. The celebrity monuments of our age have grown so huge that they dwarf the aspirations of ordinary people, who are asked to yield their dreams to the gods: to flash their favorite singer’s corporate logo at concerts, to pour open their lives (and data) on Facebook, to adopt Apple as a lifestyle. We know our stars aren’t inviting us to think we can be just like them. Their success is based on leaving the rest of us behind.
aliciathompson1

Who Are Donald Trump's Supporters? - The Atlantic - 0 views

  • The first story about the typical Trump buyer was simple: These were poorly informed voters, swept up by a modern circus act orchestrated by a mass-media-age P. T. Barnum with arguably worse hair. But Trump’s appeal has proven to be more than a passing fad.
  • Back in December, a Washington Post analysis found that Trump's support skewed male, white, and poor.
  • The single best predictor of Trump support in the GOP primary is the absence of a college degree.
  • ...3 more annotations...
  • If there were one question to identify a Trump supporter if you knew nothing else about him, what might it be? “Are you a middle-aged white man who hasn’t graduated from college?” might be a good one. But according to a survey from RAND Corporation, there is one that’s even better: Do you feel voiceless?
  • They Want to Wage an Interior War Against Outsiders
  • They Live in Parts of the Country With Racial Resentment
Javier E

Why Our Children Don't Think There Are Moral Facts - NYTimes.com - 1 views

  • I already knew that many college-aged students don’t believe in moral facts.
  • the overwhelming majority of college freshmen in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.
  • where is the view coming from?
  • ...32 more annotations...
  • the Common Core standards used by a majority of K-12 programs in the country require that students be able to “distinguish among fact, opinion, and reasoned judgment in a text.”
  • So what’s wrong with this distinction and how does it undermine the view that there are objective moral facts?
  • For example, many people once thought that the earth was flat. It’s a mistake to confuse truth (a feature of the world) with proof (a feature of our mental lives)
  • Furthermore, if proof is required for facts, then facts become person-relative. Something might be a fact for me if I can prove it but not a fact for you if you can’t. In that case, E=mc² is a fact for a physicist but not for me.
  • worse, students are taught that claims are either facts or opinions. They are given quizzes in which they must sort claims into one camp or the other but not both. But if a fact is something that is true and an opinion is something that is believed, then many claims will obviously be both
  • How does the dichotomy between fact and opinion relate to morality
  • Kids are asked to sort facts from opinions and, without fail, every value claim is labeled as an opinion.
  • Here’s a little test devised from questions available on fact vs. opinion worksheets online: are the following facts or opinions?
    — Copying homework assignments is wrong.
    — Cursing in school is inappropriate behavior.
    — All men are created equal.
    — It is worth sacrificing some personal liberties to protect our country from terrorism.
    — It is wrong for people under the age of 21 to drink alcohol.
    — Vegetarians are healthier than people who eat meat.
    — Drug dealers belong in prison.
  • The answer? In each case, the worksheets categorize these claims as opinions. The explanation on offer is that each of these claims is a value claim and value claims are not facts. This is repeated ad nauseam: any claim with good, right, wrong, etc. is not a fact.
  • In summary, our public schools teach students that all claims are either facts or opinions and that all value and moral claims fall into the latter camp. The punchline: there are no moral facts. And if there are no moral facts, then there are no moral truths.
  • It should not be a surprise that there is rampant cheating on college campuses: If we’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong, we can’t very well blame them for doing so later on.
  • If it’s not true that it’s wrong to murder a cartoonist with whom one disagrees, then how can we be outraged? If there are no truths about what is good or valuable or right, how can we prosecute people for crimes against humanity? If it’s not true that all humans are created equal, then why vote for any political system that doesn’t benefit you over others?
  • the curriculum sets our children up for doublethink. They are told that there are no moral facts in one breath even as the next tells them how they ought to behave.
  • Our children deserve a consistent intellectual foundation. Facts are things that are true. Opinions are things we believe. Some of our beliefs are true. Others are not. Some of our beliefs are backed by evidence. Others are not.
  • Value claims are like any other claims: either true or false, evidenced or not.
  • The hard work lies not in recognizing that at least some moral claims are true but in carefully thinking through our evidence for which of the many competing moral claims is correct.
  • Moral truths are not the same as scientific truths or mathematical truths. Yet they may still be used as a guiding principle for our individual lives as well as our laws. But there is equal danger in giving moral judgments the designation of truth as there is in not doing so. Many people believe that abortion is murder on the same level as shooting someone with a gun. But many others do not. So is it true that abortion is murder? Moral principles can become generally accepted and then form the basis for our laws. But many long accepted moral principles were later rejected as being faulty. "Separate but equal" is an example. Judging homosexual relationships as immoral is another example.
  • Whoa! That Einstein derived an equation is a fact. But the equation represents a theory that may have to be tweaked at some point in the future. It may be a fact that the equation foretold the violence of atomic explosions, but there are aspects of nature that elude the equation. Remember "the theory of everything?"
  • Here is a moral fact: this is a sermon masquerading as a philosophical debate on facts, opinions and truth. This professor of religion is asserting that the government, via Common Core, is teaching atheism via the opinion-vs.-fact distinction. He is arguing, in dishonest form, that public schools should be teaching moral facts. Of course "moral facts" is code for the Ten Commandments.
  • As a fourth grade teacher, I try to teach students to read critically, including distinguishing between facts and opinions as they read (and have been doing this long before the Common Core arrived, by the way). It's not always easy for children to grasp the difference. I can only imagine the confusion that would ensue if I introduced a third category -- moral "facts" that can't be proven but are true nonetheless!
  • horrible acts occur not because of moral uncertainty, but because people are too sure that their views on morality are 100% true, and that anyone who fails to recognize this and submit is a heathen who deserves death. I can't think of any case where a society has suffered because people are too thoughtful and open-minded to different perspectives on moral truth. In any case, it's not an elementary school's job to teach "moral truths."
  • The characterization of moral anti-realism as some sort of fringe view in philosophy is misleading. Claims that can be true or false are, it seems, 'made true' by features of the world. It's not clear to many in philosophy (like me) just what features of the world could make our moral claims true. We are more likely to see people's value claims as making claims about, and enforcing conformity to, our own (contingent) social norms. This is not to hold, as Mr. McBrayer seems to think follows, that there are no reasons to endorse or criticize these social norms.
  • This is nonsense. Giving kids the tools to distinguish between fact and opinion is hard enough in an age when Republicans actively deny reality on Fox News every night. The last thing we need is to muddy their thinking with the concept of "moral facts." A fact is a belief that everyone _should_ agree upon because it is observable and testable. Morals are not agreed upon by all. Consider the hot button issue of abortion.
  • Truthfully, I'm not terribly concerned that third graders will end up taking these lessons in the definition of fact versus opinion to the extremes considered here, or take them as a license to cheat. That will come much later, when they figure out, as people always have, what they can get away with. But Prof. McBrayer, with his blithe expectation that all the grownups know that there are moral "facts"? He scares the heck out of me.
  • I've long chafed at the language of "fact" v. "opinion", which is grounded in a very particular, limited view of human cognition. In my own ethics courses, I work actively to undermine the distinction, focusing instead on considered judgment . . . or even more narrowly, on consideration itself. (See http://wp.me/p5Ag0i-6M )
  • The real waffle here is the very concept of "moral facts." Our statements of values, even very important ones are, obviously, not facts. Trying to dress them up as if they are facts, to me, argues for a pretty serious moral weakness on the part of those advancing the idea.
  • Our core values are not important because they are facts. They are important because we collectively hold them and cherish them. To lean on the false crutch of "moral facts" is to admit the weakness of your own moral convictions.
  • I would like to believe that there is a core of moral facts/values upon which all humanity can agree, but it would be tough to identify exactly what those are.
  • For the ancient philosophers, reality comprised the Good, the True, and the Beautiful (what we might now call ethics, science and art), seeing these as complementary and inseparable, though distinct, realms. With the ascendancy of science in our culture as the only valid measure of reality to the detriment of ethics and art (that is, if it is not observable and provable, it is not real), we have turned the good and the beautiful into mere "social constructs" that have no validity on their own. While I am sympathetic in many ways with Dr. McBrayer's objections, I think he falls into the trap of discounting the Good and the Beautiful as valid in and of themselves, and tries, instead, to find ways to give them validity through the True. I think his argument would have been stronger had he used the language of validity rather than the language of truth. Goodness, Truth and Beauty each have their own validity, though interdependent and inseparable. When we artificially extract one of these and give it primacy, we distort reality and alienate ourselves from it.
  • Professor McBrayer seems to miss the major point of the Common Core concern: can students distinguish between premises based on (reasonably construed) fact and premises based on emotion when evaluating conclusions? I would prefer that students learn to reason rather than be taught moral 'truth' that follows Professor McBrayer's logic.
  • Moral issues cannot scientifically be treated on the level that Prof. McBrayer is attempting to use in this column: true or false, fact or opinion or both. Instead, they should be treated as important characteristics of the systematic working of a society or of a group of people in general. One can compare the working of two groups of people: one in which e.g. cheating and lying are acceptable, and one in which they are not. One can use historical or model examples to show the consequences and the working of specific systems of morals. I think that this method - suitably adjusted - can be used even in second grade.
  • Relativism has nothing to do with liberalism. The second point is that I'm not sure it does all that much harm, because I have yet to encounter a student who thought that he or she had to withhold judgment on those who hold opposing political views!