Home/ TOK Friends/ Group items tagged breakthrough

Ellie McGinnis

The 50 Greatest Breakthroughs Since the Wheel - James Fallows - The Atlantic

  • Some questions you ask because you want the right answer. Others are valuable because no answer is right; the payoff comes from the range of attempts.
  • Hence the diversity of views about the types of historical breakthroughs that matter, along with a striking consensus on whether the long trail of innovation recorded here is now nearing its end.
  • The clearest example of consensus was the first item on the final compilation, the printing press
  • Leslie Berlin, a historian of business at Stanford, organized her nominations not as an overall list but grouped into functional categories.
  • Innovations that expand the human intellect and its creative, expressive, and even moral possibilities.
  • Innovations that are integral to the physical and operating infrastructure of the modern world
  • Innovations that enabled the Industrial Revolution and its successive waves of expanded material output
  • Innovations extending life, to use Leslie Berlin’s term
  • Innovations that allowed real-time communication beyond the range of a single human voice
  • Innovations in the physical movement of people and goods.
  • Organizational breakthroughs that provide the software for people working and living together in increasingly efficient and modern ways
  • For our era, the major problems that technology has helped cause, and that faster innovation may or may not correct, are environmental, demographic, and socioeconomic.
  • Any collection of 50 breakthroughs must exclude 50,000 more.
  • We learn, finally, why technology breeds optimism, which may be the most significant part of this exercise.
  • Popular culture often lionizes the stars of discovery and invention
  • Finally, and less prominently than we might have found in 1950 or 1920—and less prominently than I initially expected—we have innovations in killing.
  • people who have thought deeply about innovation’s sources and effects, like our panelists, were aware of the harm it has done along with the good.
  • “Does innovation raise the wealth of the planet? I believe it does,” John Doerr, who has helped launch Google, Amazon, and other giants of today’s technology, said. “But technology left to its own devices widens rather than narrows the gap between the rich and the poor.”
  • Are today’s statesmen an improvement over those of our grandparents’ era? Today’s level of public debate? Music, architecture, literature, the fine arts—these and other manifestations of world culture continually change, without necessarily improving. Tolstoy and Dostoyevsky, versus whoever is the best-selling author in Moscow right now?
  • The argument that a slowdown might happen, and that it would be harmful if it did, takes three main forms.
  • Some societies have closed themselves off and stopped inventing altogether:
  • By failing to move forward, they inevitably moved backward relative to their rivals and to the environmental and economic threats they faced. If the social and intellectual climate for innovation sours, what has happened before can happen again.
  • visible slowdown in the pace of solutions that technology offers to fundamental problems.
  • a slowdown in, say, crop yields or travel time is part of a general pattern of what economists call diminishing marginal returns. The easy improvements are, quite naturally, the first to be made; whatever comes later is slower and harder.
  • America’s history as a nation happens to coincide with a rare moment in technological history now nearing its end. “There was virtually no economic growth before 1750,” he writes in a recent paper.
  • “We can be concerned about the last 1 percent of an environment for innovation, but that is because we take everything else for granted,” Leslie Berlin told me.
  • This reduction in cost, he says, means that the next decade should be a time of “amazing advances in understanding the genetic basis of disease, with especially powerful implications for cancer.”
  • the very concept of an end to innovation defied everything they understood about human inquiry. “If you look just at the 20th century, the odds against there being any improvement in living standards are enormous,”
  • “Two catastrophic world wars, the Cold War, the Depression, the rise of totalitarianism—it’s been one disaster after another, a sequence that could have been enough to sink us back into barbarism. And yet this past half century has been the fastest-ever time of technological growth. I see no reason why that should be slowing down.”
  • “I am a technological evolutionist,” he said. “I view the universe as a phase-space of things that are possible, and we’re doing a random walk among them. Eventually we are going to fill the space of everything that is possible.”
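  •  
    The “random walk through a phase-space” is only a metaphor, but the idea behind it is concrete: a walker moving at random among a finite set of possibilities will, given enough steps, visit every one of them. A minimal toy sketch (not from the article; the state space and walk rule are invented purely for illustration):

    ```python
    import random

    def random_walk_coverage(n_states: int, seed: int = 0) -> int:
        """Walk randomly around a ring of n_states possibilities and count
        the steps taken until every state has been visited at least once."""
        rng = random.Random(seed)
        state = 0
        visited = {state}
        steps = 0
        while len(visited) < n_states:
            # step to a uniformly chosen neighbouring possibility
            state = (state + rng.choice([-1, 1])) % n_states
            visited.add(state)
            steps += 1
        return steps

    print(random_walk_coverage(20))
    ```

    On a ring of n states the expected cover time grows on the order of n², so “filling the space of everything that is possible” is guaranteed in principle but can be very slow in practice.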
sissij

When Did 'Ambition' Become a Dirty Word? - The New York Times

  • but is instead a stark, black-and-white video, a public service announcement that takes on a thorny issue that dominated the last presidential campaign and has divided people on the right and left.
  • “Embrace Ambition,”
  • “I can think of a lot of dirty words,” Ms. Witherspoon says. “Ambition is not one of them.”
  • Nevertheless, she seemed to choose her words carefully as she spoke about the campaign.
  • she wanted to get away from the idea that this project was politically motivated, or anti-Trump.
  • But the issue of ambition, and the way it is used to defame women, is nevertheless personal to her.
  • This was confusing to Ms. Burch, who never saw herself as being a particularly threatening person.
  • “I do it, too,” she said. “I’m guilty of all of it.”
  • And the word “feminist” began to shed its Bella Abzug and Betty Friedan connotations, as women like Madonna went from saying they are “not feminists” but “humanists” to wearing T-shirts at anti-Trump events that had the word “feminist” emblazoned across the center.
  •  
    Sometimes, people are ashamed of their ambition. They are afraid to have a dream because they could not bear any failure. I think people don't dare to dream big now, perhaps because of the economic thinking of recent years. Most people follow the efficiency rule, so nobody is willing to take risks and make a revolution. Putting a big stake on the table is not the most efficient choice in economics. Although economics tells us it is okay to be greedy, it limits us to the model and makes us less willing to make a breakthrough. --Sissi (3/2/2017)
Javier E

Crowd-Sourcing Brain Research Leads to Breakthrough - NYTimes.com

  • these guys finally did what needed to be done to take a real stab at merging imaging and genomics
  • Brain imaging studies are expensive and, as a result, far too small to reliably tease out the effects of common gene variations. These effects tend to be tiny, for one thing, and difficult to distinguish from the background “noise” of other influences. And brain imaging is notoriously noisy: not only does overall brain size vary from person to person, for instance, but so do the sizes of specialized brain regions like the hippocampus, which is critical for memory formation.
  • persuaded research centers around the world to pool their resources and create one large database. It included genetic and extensive brain imaging results from about 21,000 people. The team then analyzed the collective data to see whether any genes were linked to brain structure.
Javier E

Our Machine Masters - NYTimes.com

  • the smart machines of the future won’t be humanlike geniuses like HAL 9000 in the movie “2001: A Space Odyssey.” They will be more modest machines that will drive your car, translate foreign languages, organize your photos, recommend entertainment options and maybe diagnose your illnesses. “Everything that we formerly electrified we will now cognitize,” Kelly writes. Even more than today, we’ll lead our lives enmeshed with machines that do some of our thinking tasks for us.
  • This artificial intelligence breakthrough, he argues, is being driven by cheap parallel computation technologies, big data collection and better algorithms. The upshot is clear: “The business plans of the next 10,000 start-ups are easy to forecast: Take X and add A.I.”
  • Two big implications flow from this. The first is sociological. If knowledge is power, we’re about to see an even greater concentration of power.
  • in 2001, the top 10 websites accounted for 31 percent of all U.S. page views, but, by 2010, they accounted for 75 percent of them.
  • The Internet has created a long tail, but almost all the revenue and power is among the small elite at the head.
  • Advances in artificial intelligence will accelerate this centralizing trend. That’s because A.I. companies will be able to reap the rewards of network effects. The bigger their network and the more data they collect, the more effective and attractive they become.
  • As a result, our A.I. future is likely to be ruled by an oligarchy of two or three large, general-purpose cloud-based commercial intelligences.
  • engineers at a few gigantic companies will have vast-though-hidden power to shape how data are collected and framed, to harvest huge amounts of information, to build the frameworks through which the rest of us make decisions and to steer our choices. If you think this power will be used for entirely benign ends, then you have not read enough history.
  • The second implication is philosophical. A.I. will redefine what it means to be human. Our identity as humans is shaped by what machines and other animals can’t do
  • On the other hand, machines cannot beat us at the things we do without conscious thinking: developing tastes and affections, mimicking each other and building emotional attachments, experiencing imaginative breakthroughs, forming moral sentiments.
  • For the last few centuries, reason was seen as the ultimate human faculty. But now machines are better at many of the tasks we associate with thinking — like playing chess, winning at Jeopardy, and doing math.
  • In the age of smart machines, we’re not human because we have big brains. We’re human because we have social skills, emotional capacities and moral intuitions.
  • I could paint two divergent A.I. futures, one deeply humanistic, and one soullessly utilitarian.
  • In the cold, utilitarian future, on the other hand, people become less idiosyncratic. If the choice architecture behind many decisions is based on big data from vast crowds, everybody follows the prompts and chooses to be like each other. The machine prompts us to consume what is popular, the things that are easy and mentally undemanding.
  • In this future, there is increasing emphasis on personal and moral faculties: being likable, industrious, trustworthy and affectionate. People are evaluated more on these traits, which supplement machine thinking, and not the rote ones that duplicate it
  • In the humanistic one, machines liberate us from mental drudgery so we can focus on higher and happier things. In this future, differences in innate I.Q. are less important. Everybody has Google on their phones so having a great memory or the ability to calculate with big numbers doesn’t help as much.
  • In the current issue of Wired, the technology writer Kevin Kelly says that we had all better get used to this level of predictive prowess. Kelly argues that the age of artificial intelligence is finally at hand.
dpittenger

3-D-printed organs are on the way - Nov. 4, 2014

  • Add one more to the growing list of 3-D-printed products: human organs.
  • Within the next few years, Renard says 3-D-printed tissues could also be used in patient treatment, to replace small parts or organs or encourage cell regeneration.
  • Once the cells have been printed in the right arrangement, they begin to signal to one another, fuse and organize themselves into a collective system.
  • That hasn't stopped scientists from trying, however. Harvard researchers are at work trying to print functioning human kidneys, while a team at the University of Louisville is trying to produce a 3-D-printed heart.
  •  
    Scientists are creating organs using 3-D printers. This is a big scientific breakthrough that changes how we think about biology.
pier-paolo

Opinion | Beyond the Brain - The New York Times

  • Somebody makes an important scientific breakthrough, which explains a piece of the world. But then people get caught up in the excitement of this breakthrough and try to use it to explain everything.
  • This is what’s happening right now with neuroscience. The field is obviously incredibly important and exciting. From personal experience, I can tell you that you get captivated by it and sometimes go off to extremes, as if understanding the brain is the solution to understanding all thought and behavior.
  • Neuroscience will replace psychology and other fields as the way to understand action.
  • The brain is not the mind. It is probably impossible to look at a map of brain activity and predict or even understand the emotions, reactions, hopes and desires of the mind.
  • But the amygdala lights up during fear, happiness, novelty, anger or sexual arousal (at least in women). The insula plays a role in processing trust, insight, empathy, aversion and disbelief. So what are you really looking at?
  • Then there is the problem that one action can arise out of many different brain states and the same event can trigger many different brain reactions.
  • A glass of water may be more meaningful to you when you are dying of thirst than when you are not. Your lover means more than your friend. It’s as hard to study neurons and understand the flavors of meaning as it is to study Shakespeare’s spelling and understand the passions aroused by Macbeth.
  • Right now we are compelled to rely on different disciplines to try to understand behavior on multiple levels, with inherent tensions between them. Some people want to reduce that ambiguity by making one discipline all-explaining. They want to eliminate the confusing ambiguity of human freedom by reducing everything to material determinism.
  • An important task these days is to harvest the exciting gains made by science and data while understanding the limits of science and data.
  • The brain is not the mind.
ilanaprincilus06

Meet the neuroscientist shattering the myth of the gendered brain | Science | The Guardian

  • Whatever its sex, this baby’s future is predetermined by the entrenched belief that males and females do all kinds of things differently, better or worse, because they have different brains.
  • how vital it is, how life-changing, that we finally unpack – and discard – the sexist stereotypes and binary coding that limit and harm us.
  • she is out in the world, debunking the “pernicious” sex differences myth: the idea that you can “sex” a brain or that there is such a thing as a male brain and a female brain.
  • since the 18th century “when people were happy to spout off about what men and women’s brains were like – before you could even look at them. They came up with these nice ideas and metaphors that fitted the status quo and society, and gave rise to different education for men and women.”
  • she couldn’t find any beyond the negligible, and other research was also starting to question the very existence of such differences. For example, once any differences in brain size were accounted for, “well-known” sex differences in key structures disappeared.
  • Are there any significant differences based on sex alone? The answer, she says, is no.
  • “The idea of the male brain and the female brain suggests that each is a characteristically homogenous thing and that whoever has got a male brain, say, will have the same kind of aptitudes, preferences and personalities as everyone else with that ‘type’ of brain. We now know that is not the case.
  • ‘Forget the male and female brain; it’s a distraction, it’s inaccurate.’ It’s possibly harmful, too, because it’s used as a hook to say, well, there’s no point girls doing science because they haven’t got a science brain, or boys shouldn’t be emotional or should want to lead.”
  • The next question was, what then is driving the differences in behaviour between girls and boys, men and women?
  • “that the brain is moulded from birth onwards and continues to be moulded through to the ‘cognitive cliff’ in old age when our grey cells start disappearing.
  • the brain is much more a function of experiences. If you learn a skill your brain will change, and it will carry on changing.”
  • The brain is also predictive and forward-thinking in a way we had never previously realised.
  • The rules will change how the brain works and how someone behaves.” The upshot of gendered rules? “The ‘gender gap’ becomes a self-fulfilling prophecy.”
  • The brain is a biological organ. Sex is a biological factor. But it is not the sole factor; it intersects with so many variables.”
  • Letting go of age-old certainties is frightening, concedes Rippon, who is both optimistic about the future, and fearful for it.
  • On the plus side, our plastic brains are good learners. All we need to do is change the life lessons.
  • One major breakthrough in recent years has been the realisation that, even in adulthood, our brains are continually being changed, not just by the education we receive, but also by the jobs we do, the hobbies we have, the sports we play.
  • Once we acknowledge that our brains are plastic and mouldable, then the power of gender stereotypes becomes evident.
  • Beliefs about sex differences (even if ill-founded) inform stereotypes, which commonly provide just two labels – girl or boy, female or male – which, in turn, historically carry with them huge amounts of “contents assured” information and save us having to judge each individual on their own merits
  • With input from exciting breakthroughs in neuroscience, the neat, binary distinctiveness of these labels is being challenged – we are coming to realise that nature is inextricably entangled with nurture.
  • The 21st century is not just challenging the old answers – it is challenging the question itself.
Javier E

UK mathematician wins richest prize in academia | Mathematics | The Guardian

  • Martin Hairer, an Austrian-British researcher at Imperial College London, is the winner of the 2021 Breakthrough prize for mathematics, an annual $3m (£2.3m) award that has come to rival the Nobels in terms of kudos and prestige.
  • Hairer landed the prize for his work on stochastic analysis, a field that describes how random effects turn the maths of things like stirring a cup of tea, the growth of a forest fire, or the spread of a water droplet that has fallen on a tissue into a fiendishly complex problem.
  • His major work, a 180-page treatise that introduced the world to “regularity structures”, so stunned his colleagues that one suggested it must have been transmitted to Hairer by a more intelligent alien civilisation.
  • After dallying with physics at university, Hairer moved into mathematics. The realisation that ideas in theoretical physics can be overturned and swiftly consigned to the dustbin did not appeal. “I wouldn’t really want to put my name to a result that could be superseded by something else three years later,” he said. “In mathematics, if you obtain a result then that is it. It’s the universality of mathematics, you discover absolute truths.”
  • Hairer’s expertise lies in stochastic partial differential equations, a branch of mathematics that describes how randomness throws disorder into processes such as the movement of wind in a wind tunnel or the creeping boundary of a water droplet landing on a tissue. When the randomness is strong enough, solutions to the equations get out of control. “In some cases, the solutions fluctuate so wildly that it is not even clear what the equation meant in the first place,” he said.
  • With the invention of regularity structures, Hairer showed how the infinitely jagged noise that threw his equations into chaos could be reframed and tamed.
Javier E

If 'permacrisis' is the word of 2022, what does 2023 have in store for our me...

  • the Collins English Dictionary has come to a similar conclusion about recent history. Topping its “words of the year” list for 2022 is permacrisis, defined as an “extended period of insecurity and instability”. This new word fits a time when we lurch from crisis to crisis and wreckage piles upon wreckage
  • The word permacrisis is new, but the situation it describes is not. According to the German historian Reinhart Koselleck we have been living through an age of permanent crisis for at least 230 years
  • During the 20th century, the list got much longer. In came existential crises, midlife crises, energy crises and environmental crises. When Koselleck was writing about the subject in the 1970s, he counted up more than 200 kinds of crisis we could then face
  • Koselleck observes that prior to the French revolution, a crisis was a medical or legal problem but not much more. After the fall of the ancien regime, crisis becomes the “structural signature of modernity”, he writes. As the 19th century progressed, crises multiplied: there were economic crises, foreign policy crises, cultural crises and intellectual crises.
  • When he looked at 5,000 creative individuals over 127 generations in European history, he found that significant creative breakthroughs were less likely during periods of political crisis and instability.
  • Victor H Mair, a professor of Chinese literature at the University of Pennsylvania, points out that in fact the Chinese word for crisis, wēijī, refers to a perilous situation in which you should be particularly cautious
  • “Those who purvey the doctrine that the Chinese word for ‘crisis’ is composed of elements meaning ‘danger’ and ‘opportunity’ are engaging in a type of muddled thinking that is a danger to society,” he writes. “It lulls people into welcoming crises as unstable situations from which they can benefit.” Revolutionaries, billionaires and politicians may relish the chance to profit from a crisis, but most people would prefer not to have a crisis at all.
  • A common folk theory is that times of great crisis also lead to great bursts of creativity.
  • The first world war sparked the growth of modernism in painting and literature. The second fuelled innovations in science and technology. The economic crises of the 1970s and 80s are supposed to have inspired the spread of punk and the creation of hip-hop
  • psychologists have also found that when we are threatened by a crisis, we become more rigid and locked into our beliefs. The creativity researcher Dean Simonton has spent his career looking at breakthroughs in music, philosophy, science and literature. He has found that during periods of crisis, we actually tend to become less creative.
  • psychologists have found that it is what they call “malevolent creativity” that flourishes when we feel threatened by crisis.
  • during moments of significant crisis, the best leaders are able to create some sense of certainty and a shared fate amid the seas of change.
  • These are innovations that tend to be harmful – such as new weapons, torture devices and ingenious scams.
  • A 2019 study which involved observing participants using bricks, found that those who had been threatened before the task tended to come up with more harmful uses of the bricks (such as using them as weapons) than people who did not feel threatened
  • Students presented with information about a threatening situation tended to become increasingly wary of outsiders, and even begin to adopt positions such as an unwillingness to support LGBT people afterwards.
  • during moments of crisis – when change is really needed – we tend to become less able to change.
  • When we suffer significant traumatic events, we tend to have worse wellbeing and life outcomes.
  • , other studies have shown that in moderate doses, crises can help to build our sense of resilience.
  • we tend to be more resilient if a crisis is shared with others. As Bruce Daisley, the ex-Twitter vice-president, notes: “True resilience lies in a feeling of togetherness, that we’re united with those around us in a shared endeavour.”
  • Crises are like many things in life – only good in moderation, and best shared with others
  • The challenge our leaders face during times of overwhelming crisis is to avoid letting us plunge into the bracing ocean of change alone, to see if we sink or swim. Nor should they tell us things are fine, encouraging us to hide our heads in the sand.
  • Waking up each morning to hear about the latest crisis is dispiriting for some, but throughout history it has been a bracing experience for others. In 1857, Friedrich Engels wrote in a letter that “the crisis will make me feel as good as a swim in the ocean”. A hundred years later, John F Kennedy (wrongly) pointed out that in the Chinese language, the word “crisis” is composed of two characters, “one representing danger, and the other, opportunity”. More recently, Elon Musk has argued “if things are not failing, you are not innovating enough”.
  • This means people won’t feel an overwhelming sense of threat. It also means people do not feel alone. When we feel some certainty and common identity, we are more likely to be able to summon the creativity, ingenuity and energy needed to change things.
Emily Horwitz

UK, Japan scientists win Nobel for stem cell breakthroughs | Reuters

  • Scientists from Britain and Japan shared a Nobel Prize on Monday for the discovery that adult cells can be transformed back into embryo-like stem cells that may one day regrow tissue in damaged brains, hearts or other organs.
  • discovered ways to create tissue that would act like embryonic cells, without the need to harvest embryos.
  • "These groundbreaking discoveries have completely changed our view of the development and specialization of cells," the Nobel Assembly at Stockholm's Karolinska Institute said.
  • big hope for stem cells is that they can be used to replace damaged tissue in everything from spinal cord injuries to Parkinson's disease.
  • Scientists once thought it was impossible to turn adult tissue back into stem cells, which meant that new stem cells could only be created by harvesting embryos - a practice that raised ethical qualms in some countries and also means that implanted cells might be rejected by the body.
  • The new stem cells are known as "induced pluripotency stem cells", or iPS cells.
  • "We would like to be able to find a way of obtaining spare heart or brain cells from skin or blood cells. The important point is that the replacement cells need to be from the same individual, to avoid problems of rejection and hence of the need for immunosuppression."
  • Thomas Perlmann, Nobel Committee member and professor of Molecular Development Biology at the Karolinska Institute said: "Thanks to these two scientists, we know now that development is not strictly a one-way street."
  • "You can't take out a large part of the heart or the brain or so to study this, but now you can take a cell from for example the skin of the patient, reprogram it, return it to a pluripotent state, and then grow it in a laboratory," he said.
aliciathompson1

Why we should have seen Trump coming - BBC News

  • Christie's blessing came as a bolt from the blue, and taught us once more to expect the unexpected. But shouldn't the establishment - and us in the media, for that matter - have seen the billionaire coming? After all, for years the Republican standard bearers have been vulnerable to a challenge from an anti-establishment candidate.
  • The most obvious reason for the decline of the Republican establishment has been the rise of anti-establishment adversaries. The Tea Party, an insurgent grassroots movement that emerged after Barack Obama's inauguration, has posed the most serious threat.
  • However, most of us made the mistake of interpreting the results of the congressional mid-term elections as a major setback for insurgents, because they failed to make more breakthroughs.
  • Revulsion right now of the permanent political class and party elites seems to be a global phenomenon, but in America it is particularly pronounced, on the left as well as the right.
  • But an anti-establishment figure like Donald Trump would not have become so strong had not the party establishment become so weak. The GOP, the Grand Old Party, has been ripe for a takeover for years.
tornekm

DNA 'tape recorder' to trace cell history - BBC News

  • The technique is being hailed as a breakthrough in understanding how the trillions of complex cells in a body are descended from a single egg.
  • The human body has around 40 trillion cells, each with a highly specialised function. Yet each can trace its history back to the same starting point - a fertilised egg.
  • The molecular tape recorder developed by Prof Shendure's team at the University of Washington in Seattle, US, is a length of DNA inserted into the genome that contains a series of edit points which can be changed throughout an organism's life.
  • "Cancers develop by a lineage, too," Alex Schier told the BBC. "Our technique can be used to follow these lineages during cancer formation - to tell us the relationships of cells within a tumour, and between the original tumour and secondary tumours formed by metastasis."
Javier E

In This Snapchat Campaign, Election News Is Big and Then It's Gone - The New York Times

  • Every modern presidential election is at least in part defined by the cool new media breakthrough of its moment.
  • In 2000, there was email, and by golly was that a big change from the fax. The campaigns could get their messages in front of print and cable news reporters — who could still dominate the campaign narrative — at will,
  • The 2004 campaign was the year of the “Web log,” or blog, when mainstream reporters and campaigns officially began losing any control they may have had over political news.
  • Then 2008: Facebook made it that much easier for campaigns to reach millions of people directly,
  • Marco Rubio’s campaign marched into the election season ready to fight the usual news-cycle-by-news-cycle skirmishes. It was surprised to learn that, lo and behold, “There was no news cycle — everything was one big fire hose,” Alex Conant, a senior Rubio strategist, told me. “News was constantly breaking and at the end of the day hardly anything mattered. Things would happen; 24 hours later, everyone was talking about something else.”
  • Snapchat represents a change to something else: the longevity of news, how durably it keeps in our brain cells and our servers.
  • Snapchat is recording the here and the now, playing for today. Tomorrow will bring something new that renders today obsolete. It’s a digital Tibetan sand painting made in the image of the millennial mind.
  • Snapchat executives say they set up the app this way because this is what their tens of millions of younger users want; it’s how they live.
  • They can’t possibly have enough bandwidth to process all the incoming information and still dwell on what already was, can they?
  • Experienced strategists and their candidates, who could always work through their election plans methodically — promoting their candidacies one foot in front of the other, adjusting here and there for the unexpected — suddenly found that they couldn’t operate the way they always did.
  • The question this year has been whether 2016 will be the “Snapchat election.”
  • Then there was Jeb Bush, expecting to press ahead by presenting what he saw as leading-edge policy proposals that would set off a prolonged back-and-forth. When Mr. Bush rolled out a fairly sweeping plan to upend the college loan system, the poor guy thought this was going to become a big thing.
  • It drew only modest coverage and was quickly buried by the latest bit from Donald Trump.
  • In this “hit refresh” political culture, damaging news does not have to stick around for long, either. The next development, good or bad, replaces it almost immediately.
  • Mr. Miller pointed to a recent episode in which Mr. Trump said a protester at a rally had “ties to ISIS,” after that protester charged the stage. No such ties existed. “He says ‘ISIS is attacking me’; this was debunked in eight minutes by Twitter,” Mr. Miller said. “Cable talked about it for three hours and it went away.”
  • “Hillary Clinton said that she was under sniper fire in Bosnia” — she wasn’t — “and that has stuck with her for 20 years,”
  • Mr. Trump has mastered this era of short attention spans in politics by realizing that if you’re the one regularly feeding the stream, you can forever move past your latest trouble, and hasten the mass amnesia.
  • It was with this in mind that The Washington Post ran an editorial late last week reminding its readers of some of Mr. Trump’s more outlandish statements and policy positions
  • The Post urged its readers to “remember” more than two dozen items from Mr. Trump’s record, including that he promised “to round up 11 million undocumented immigrants and deport them,” and “lied about President Obama’s birth certificate.”
  • as the media habits of the young drive everybody else’s, I’m reminded of that old saw about those who forget history. Now, what was I saying?
Javier E

Choose to Be Grateful. It Will Make You Happier. - The New York Times - 2 views

  • Building the best life does not require fealty to feelings in the name of authenticity, but rather rebelling against negative impulses and acting right even when we don’t feel like it. In a nutshell, acting grateful can actually make you grateful.
  • some people are just naturally more grateful than others. A 2014 article in the journal Social Cognitive and Affective Neuroscience identified a variation in a gene (CD38) associated with gratitude. Some people simply have a heightened genetic tendency to experience, in the researchers’ words, “global relationship satisfaction, perceived partner responsiveness and positive emotions (particularly love).” That is, those relentlessly positive people you know who seem grateful all the time may simply be mutants.
  • Evidence suggests that we can actively choose to practice gratitude — and that doing so raises our happiness.
  • ...11 more annotations...
  • Researchers in one 2003 study randomly assigned one group of study participants to keep a short weekly list of the things they were grateful for, while other groups listed hassles or neutral events. Ten weeks later, the first group enjoyed significantly greater life satisfaction than the others.
  • acting happy, regardless of feelings, coaxes one’s brain into processing positive emotions. In one famous 1993 experiment, researchers asked human subjects to smile forcibly for 20 seconds while tensing facial muscles, notably the muscles around the eyes called the orbicularis oculi (which create “crow’s feet”). They found that this action stimulated brain activity associated with positive emotions.
  • gratitude stimulates the hypothalamus (a key part of the brain that regulates stress) and the ventral tegmental area (part of our “reward circuitry” that produces the sensation of pleasure).
  • In the slightly more elegant language of the Stoic philosopher Epictetus, “He is a man of sense who does not grieve for what he has not, but rejoices in what he has.”
  • In addition to building our own happiness, choosing gratitude can also bring out the best in those around us
  • when their competence was questioned, the subjects tended to lash out with aggression and personal denigration. When shown gratitude, however, they reduced the bad behavior. That is, the best way to disarm an angry interlocutor is with a warm “thank you.”
  • A new study in the Journal of Consumer Psychology finds evidence that people begin to crave sweets when they are asked to express gratitude.
  • There are concrete strategies that each of us can adopt. First, start with “interior gratitude,” the practice of giving thanks privately
  • he recommends that readers systematically express gratitude in letters to loved ones and colleagues. A disciplined way to put this into practice is to make it as routine as morning coffee. Write two short emails each morning to friends, family or colleagues, thanking them for what they do.
  • Finally, be grateful for useless things
  • think of the small, useless things you experience — the smell of fall in the air, the fragment of a song that reminds you of when you were a kid. Give thanks.
anonymous

Are search engines and the Internet hurting human memory? - Slate Magazine - 2 views

  • are we losing the power to retain knowledge? The short answer is: No. Machines aren’t ruining our memory. The longer answer: It’s much, much weirder than that!
  • we’ve begun to fit the machines into an age-old technique we evolved thousands of years ago—“transactive memory.” That’s the art of storing information in the people around us.
  • frankly, our brains have always been terrible at remembering details. We’re good at retaining the gist of the information we encounter. But the niggly, specific facts? Not so much.
  • ...22 more annotations...
  • subjects read several sentences. When he tested them 40 minutes later, they could generally remember the sentences word for word. Four days later, though, they were useless at recalling the specific phrasing of the sentences—but still very good at describing the meaning of them.
  • When you’re an expert in a subject, you can retain new factoids on your favorite topic easily. This only works for the subjects you’re truly passionate about, though
  • They were, in a sense, Googling each other.
  • Wegner noticed that spouses often divide up memory tasks. The husband knows the in-laws' birthdays and where the spare light bulbs are kept; the wife knows the bank account numbers and how to program the TiVo
  • Together, they know a lot. Separately, less so.
  • Wegner suspected this division of labor takes place because we have pretty good "metamemory." We're aware of our mental strengths and limits, and we're good at intuiting the memory abilities of others.
  • We share the work of remembering, Wegner argued, because it makes us collectively smarter
  • The groups that scored highest on a test of their transactive memory—in other words, the groups where members most relied on each other to recall information—performed better than those who didn't use transactive memory. Transactive groups don’t just remember better: They also analyze problems more deeply, developing a better grasp of underlying principles.
  • Transactive memory works best when you have a sense of how your partners' minds work—where they're strong, where they're weak, where their biases lie. I can judge that for people close to me. But it's harder with digital tools, particularly search engines
  • "the thinking processes of the intimate dyad."
  • And as it turns out, this is what we’re doing with Google and Evernote and our other digital tools. We’re treating them like crazily memorious friends who are usually ready at hand. Our “intimate dyad” now includes a silicon brain.
  • When Sparrow tested the students, the people who knew the computer had saved the information were less likely to personally recall the info than the ones who were told the trivia wouldn't be saved. In other words, if we know a digital tool is going to remember a fact, we're slightly less likely to remember it ourselves
  • believing that one won't have access to the information in the future enhances memory for the information itself, whereas believing the information was saved externally enhances memory for the fact that the information could be accessed.
  • Just as we learn through transactive memory who knows what in our families and offices, we are learning what the computer 'knows' and when we should attend to where we have stored information in our computer-based memories,
  • We’ve stored a huge chunk of what we “know” in people around us for eons. But we rarely recognize this because, well, we prefer our false self-image as isolated, Cartesian brains
  • We’re dumber and less cognitively nimble if we're not around other people—and, now, other machines.
  • When humans spew information at us unbidden, it's boorish. When machines do it, it’s enticing.
  • Though you might assume search engines are mostly used to answer questions, some research has found that up to 40 percent of all queries are acts of remembering. We're trying to refresh the details of something we've previously encountered.
  • So humanity has always relied on coping devices to handle the details for us. We’ve long stored knowledge in books, paper, Post-it notes
  • We need to develop literacy in these tools the way we teach kids how to spell and write; we need to be skeptical about search firms’ claims of being “impartial” referees of information
  • And on an individual level, it’s still important to slowly study and deeply retain things, not least because creative thought—those breakthrough ahas—comes from deep and often unconscious rumination, your brain mulling over the stuff it has onboard.
  • you can stop worrying about your iPhone moving your memory outside your head. It moved out a long time ago—yet it’s still all around you.
kortanekev

Scientists Build New Computer Made of DNA - 0 views

  • Scientists at the University of Manchester have developed a new type of self-replicating computer that uses DNA to make calculations, a breakthrough that could make computing far more efficient.
  •  
    what are the ethical implications of a computer that functions much like we do... but better? Could a "DNA computer" program its own mutations? When computers do everything for us...what will be the pursuit of knowledge?  Evie K 3/4/17
Javier E

Creativity Becomes an Academic Discipline - NYTimes.com - 0 views

  • Once considered the product of genius or divine inspiration, creativity — the ability to spot problems and devise smart solutions — is being recast as a prized and teachable skill.
  • “The reality is that to survive in a fast-changing world you need to be creative,”
  • “That is why you are seeing more attention to creativity at universities,” he says. “The marketplace is demanding it.”
  • ...16 more annotations...
  • Creativity moves beyond mere synthesis and evaluation and is, he says, “the higher order skill.” This has not been a sudden development. Nearly 20 years ago “creating” replaced “evaluation” at the top of Bloom’s Taxonomy of learning objectives. In 2010 “creativity” was the factor most crucial for success found in an I.B.M. survey of 1,500 chief executives in 33 industries. These days “creative” is the most used buzzword in LinkedIn profiles two years running.
  • The method, which is used in Buffalo State classrooms, has four steps: clarifying, ideating, developing and implementing. People tend to gravitate to particular steps, suggesting their primary thinking style.
  • What’s igniting campuses, though, is the conviction that everyone is creative, and can learn to be more so.
  • Just about every pedagogical toolbox taps similar strategies, employing divergent thinking (generating multiple ideas) and convergent thinking (finding what works).The real genius, of course, is in the how.
  • as content knowledge evolves at lightning speed, educators are talking more and more about “process skills,” strategies to reframe challenges and extrapolate and transform information, and to accept and deal with ambiguity.
  • Ideating is brainstorming and calls for getting rid of your inner naysayer to let your imagination fly.
  • Clarifying — asking the right question — is critical because people often misstate or misperceive a problem. “If you don’t have the right frame for the situation, it’s difficult to come up with a breakthrough.”
  • Developing is building out a solution, and maybe finding that it doesn’t work and having to start over
  • Implementing calls for convincing others that your idea has value.
  • “the frequency and intensity of failures is an implicit principle of the course. Getting into a creative mind-set involves a lot of trial and error.”
  • His favorite assignments? Construct a résumé based on things that didn’t work out and find the meaning and influence these have had on your choices.
  • “Examine what in the culture is preventing you from creating something new or different. And what is it like to look like a fool because a lot of things won’t work out and you will look foolish? So how do you handle that?”
  • Because academics run from failure, Mr. Keywell says, universities are “way too often shapers of formulaic minds,” and encourage students to repeat and internalize fail-safe ideas.
  • “The new people who will be creative will sit at the juxtaposition of two or more fields,” she says. When ideas from different fields collide, Dr. Cramond says, fresh ones are generated.
  • Basic creativity tools used at the Torrance Center include thinking by analogy, looking for and making patterns, playing, literally, to encourage ideas, and learning to abstract problems to their essence.
  • students explore definitions of creativity, characteristics of creative people and strategies to enhance their own creativity. These include rephrasing problems as questions, learning not to instinctively shoot down a new idea (first find three positives), and categorizing problems as needing a solution that requires either action, planning or invention.
Javier E

Interview: Ted Chiang | The Asian American Literary Review - 0 views

  • I think most people’s ideas of science fiction are formed by Hollywood movies, so they think most science fiction is a special effects-driven story revolving around a battle between good and evil
  • I don’t think of that as a science fiction story. You can tell a good-versus-evil story in any time period and in any setting. Setting it in the future and adding robots to it doesn’t make it a science fiction story.
  • I think science fiction is fundamentally a post-industrial revolution form of storytelling. Some literary critics have noted that the good-versus-evil story follows a pattern where the world starts out as a good place, evil intrudes, the heroes fight and eventually defeat evil, and the world goes back to being a good place. Those critics have said that this is fundamentally a conservative storyline because it’s about maintaining the status quo. This is a common story pattern in crime fiction, too—there’s some disruption to the order, but eventually order is restored. Science fiction offers a different kind of story, a story where the world starts out as recognizable and familiar but is disrupted or changed by some new discovery or technology. At the end of the story, the world is changed permanently. The original condition is never restored. And so in this sense, this story pattern is progressive because its underlying message is not that you should maintain the status quo, but that change is inevitable. The consequences of this new discovery or technology—whether they’re positive or negative—are here to stay and we’ll have to deal with them.
  • ...3 more annotations...
  • There’s also a subset of this progressive story pattern that I’m particularly interested in, and that’s the “conceptual breakthrough” story, where the characters discover something about the nature of the universe which radically expands their understanding of the world.  This is a classic science fiction storyline.
  • one of the cool things about science fiction is that it lets you dramatize the process of scientific discovery, that moment of suddenly understanding something about the universe. That is what scientists find appealing about science, and I enjoy seeing the same thing in science fiction.
  • when you mention myth or mythic structure, yes, I don’t think myths can do that, because in general, myths reflect a pre-industrial view of the world. I don’t know if there is room in mythology for a strong conception of the future, other than an end-of-the-world or Armageddon scenario …
Ryan Beneck

Future Timeline - 0 views

  • Welcome to the future! Below, you will find a speculative timeline of future history. Part fact and part fiction, the timeline is based on detailed research that includes analysis of current trends, long-term environmental changes, advances in technology such as Moore's Law, future medical breakthroughs, the evolving geopolitical landscape and more.
Adam Clark

The Skull - Radiolab - 0 views

  •  
    "Today, the story of one little thing that has radically changed what we know about humanity's humble beginnings and the kinds of creatures that were out to get us way back when."