TOK Friends: Group items tagged epiphany

Javier E

Did Zeus Exist? - NYTimes.com

  • This set me thinking about why we are so certain that Zeus never existed. Of course, we are in no position to say that he did.  But are we really in a position to say that he didn’t?
  • There were “epiphanies” that could “indicate not merely a visible or audible epiphany (whether in the light of day or through a dream . . .) but also any clear expression of a god’s favor such as weather conditions hampering an enemy, a miraculous escape, or a cure.”
  • Why did belief in the gods persist in spite of critical challenges? What evidence seemed decisive to the ancient Greeks? 
  • what the Greeks saw as experiences of divine actions in their lives. “The greatest evidence for the existence of gods is that piety works . . . the converse is that impiety leads to disaster.”
  • There were also rituals, associated with the many cults of specific gods, that for some worshippers “created a sense of contact with the divine. One knows that the gods exist because one feels their presence during the drama of the mysteries or the elation of the choral dance.”
  • as this civilization developed the critical tools of historiography and philosophy, Zeus’s reality remained widely unquestioned. 
  • how can we be so sure that the Greeks lived in the same sort of world as we do?  What decisive reason do we have for thinking that for them divinity was not a widely and deeply experienced fact of life?
  • shouldn’t we hold a merely agnostic position on Zeus and the other Greek gods, taking seriously the possibility that they existed but holding that we have good reason neither to assert nor deny their existence?
  • We may well think that our world contains little or no evidence of the supernatural.  But that is no reason to think the same was true of the Greek world.
  • I’m inclined to say that an atheistic denial of Zeus is ungrounded.  There is no current evidence of his present existence, but to deny that he existed in his Grecian heyday we need to assume that there was no good evidence for his existence available to the ancient Greeks.  We have no reason to make this assumption.
Javier E

Breathing In vs. Spacing Out - NYTimes.com

  • Although pioneers like Jon Kabat-Zinn, now emeritus professor at the University of Massachusetts Medical Center, began teaching mindfulness meditation as a means of reducing stress as far back as the 1970s, all but a dozen or so of the nearly 100 randomized clinical trials have been published since 2005.
  • Michael Posner, of the University of Oregon, and Yi-Yuan Tang, of Texas Tech University, used functional M.R.I.’s before and after participants spent a combined 11 hours over two weeks practicing a form of mindfulness meditation developed by Tang. They found that it enhanced the integrity and efficiency of the brain’s white matter, the tissue that connects and protects neurons emanating from the anterior cingulate cortex, a region of particular importance for rational decision-making and effortful problem-solving.
  • Perhaps that is why mindfulness has proved beneficial to prospective graduate students. In May, the journal Psychological Science published the results of a randomized trial showing that undergraduates instructed to spend a mere 10 minutes a day for two weeks practicing mindfulness made significant improvement on the verbal portion of the Graduate Record Exam — a gain of 16 percentile points. They also significantly increased their working memory capacity, the ability to maintain and manipulate multiple items of attention.
  • By emphasizing a focus on the here and now, it trains the mind to stay on task and avoid distraction.
  • “Your ability to recognize what your mind is engaging with, and control that, is really a core strength,” said Peter Malinowski, a psychologist and neuroscientist at Liverpool John Moores University in England. “For some people who begin mindfulness training, it’s the first time in their life where they realize that a thought or emotion is not their only reality, that they have the ability to stay focused on something else, for instance their breathing, and let that emotion or thought just pass by.”
  • one of the most surprising findings of recent mindfulness studies is that it could have unwanted side effects. Raising roadblocks to the mind’s peregrinations could, after all, prevent the very sort of mental vacations that lead to epiphanies.
  • he found that having participants spend a brief period of time on an undemanding task that maximizes mind wandering improved their subsequent performance on a test of creativity. In a follow-up study, he reported that physicists and writers alike came up with their most insightful ideas while spacing out.
  • The trick is knowing when mindfulness is called for and when it’s not.
  • the higher adults scored on a measurement of mindfulness, the worse they performed on tests of implicit learning — the kind that underlies all sorts of acquired skills and habits but that occurs without conscious awareness.
  • “There’s so much our brain is doing when we’re not aware of it,” said the study’s leader, Chelsea Stillman, a doctoral candidate. “We know that being mindful is really good for a lot of explicit cognitive functions. But it might not be so useful when you want to form new habits.” Learning to ride a bicycle, speak grammatically or interpret the meaning of people’s facial expressions are three examples of knowledge we acquire through implicit learning.
Javier E

'Trespassing on Einstein's Lawn,' by Amanda Gefter - NYTimes.com

  • It all began when Warren Gefter, a radiologist “prone to posing Zen-koan-like questions,” asked his 15-year-old daughter, Amanda, over dinner at a Chinese restaurant near their home just outside Philadelphia: “How would you define nothing?”
  • “I think we should figure it out,” he said. And his teenage daughter — sullen, rebellious, wallowing in existential dread — smiled for the first time “in what felt like years.” The project proved to be a gift from a wise, insightful father. It was Warren Gefter’s way of rescuing his child.
  • “If observers create reality, where do the observers come from?” But the great man responded in riddles. “The universe is a self-excited circuit,” Wheeler said. “The boundary of a boundary is zero.” The unraveling of these mysteries propels the next 400 or so pages.
  • She became a science journalist. At first it was a lark, a way to get free press passes to conferences where she and her father could ask questions of the greatest minds in quantum mechanics, string theory and cosmology. But within a short time, as she started getting assignments, journalism became a calling, and an identity.
  • Tracking down the meaning of nothing — and, by extension, secrets about the origin of the universe and whether observer-independent reality exists — became the defining project of their lives. They spent hours together working on the puzzle, two dark heads bent over their physics books far into the night.
  • When she has an epiphany — that for something to be real, it must be invariant — she flies home to share it with her father. They discuss her insight over breakfast at a neighborhood haunt, where they make a list on what they will affectionately call “the IHOP napkin.” They list all the possible “ingredients of ultimate reality,” planning to test each item for whether it is “real,” that is, whether it is invariant and can exist in the absence of an observer.
  • their readings and interviews reveal that each item in turn is observer-dependent. Space? Observer-dependent, and therefore not real. Gravity, electromagnetism, angular momentum? No, no, and no. In the end, every putative “ingredient of ultimate reality” is eliminated, including one they hadn’t even bothered to put on the list because it seemed too weird to include: reality itself.
  • What remained was an unsettling and essential insight: that “physics isn’t the machinery behind the workings of the world; physics is the machinery behind the illusion that there is a world.”
  • In the proposal, she clarifies how cosmology and quantum mechanics have evolved as scientists come to grips with the fact that things they had taken to be real — quantum particles, space-time, gravity, dimension — turn out to be observer-dependent.
Javier E

The Backfire Effect « You Are Not So Smart

  • corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.
  • Once something is added to your collection of beliefs, you protect it from harm. You do it instinctively and unconsciously when confronted with attitude-inconsistent information. Just as confirmation bias shields you when you actively seek information, the backfire effect defends you when the information seeks you, when it blindsides you. Coming or going, you stick to your beliefs instead of questioning them. When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.
  • Psychologists call stories like these narrative scripts, stories that tell you what you want to hear, stories which confirm your beliefs and give you permission to continue feeling as you already do. If believing in welfare queens protects your ideology, you accept it and move on.
  • Contradictory evidence strengthens the position of the believer. It is seen as part of the conspiracy, and missing evidence is dismissed as part of the coverup.
  • Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike.
  • you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs.
  • you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response.
  • when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort – once you finally move on, your original convictions are stronger than ever.
  • The backfire effect is constantly shaping your beliefs and memory, keeping you consistently leaning one way or the other through a process psychologists call biased assimilation.
  • They then separated subjects into two groups; one group said they believed homosexuality was a mental illness and one did not. Each group then read the fake studies full of pretend facts and figures suggesting their worldview was wrong. On either side of the issue, after reading studies which did not support their beliefs, most people didn’t report an epiphany, a realization that they’d been wrong all these years. Instead, they said the issue was something science couldn’t understand. When asked about other topics later on, like spanking or astrology, these same people said they no longer trusted research to determine the truth. Rather than shed their belief and face facts, they rejected science altogether.
  • As social media and advertising progress, confirmation bias and the backfire effect will become more and more difficult to overcome. You will have more opportunities to pick and choose the kind of information which gets into your head along with the kinds of outlets you trust to give you that information. In addition, advertisers will continue to adapt, not only generating ads based on what they know about you, but creating advertising strategies on the fly based on what has and has not worked on you so far. The media of the future may be delivered based not only on your preferences, but on how you vote, where you grew up, your mood, the time of day or year – every element of you which can be quantified. In a world where everything comes to you on demand, your beliefs may never be challenged.
Javier E

'Our minds can be hijacked': the tech insiders who fear a smartphone dystopia | Technol...

  • Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called “attention economy”: an internet shaped around the demands of an advertising economy.
  • “It is very common,” Rosenstein says, “for humans to develop things with the best of intentions and for them to have unintended, negative consequences.”
  • He is most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.
  • There is growing concern that as well as addicting users, technology is contributing toward so-called “continuous partial attention”, severely limiting people’s ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity – even when the device is turned off. “Everyone is distracted,” Rosenstein says. “All of the time.”
  • Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.
  • Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, “which scrubs out a lot of those external triggers” he writes about in his book, and recommended an app called Pocket Points that “rewards you for staying off your phone when you need to focus”.
  • “One reason I think it is particularly important for us to talk about this now is that we may be the last generation that can remember life before,” Rosenstein says. It may or may not be relevant that Rosenstein, Pearlman and most of the tech insiders questioning today’s attention economy are in their 30s, members of the last generation that can remember a world in which telephones were plugged into walls.
  • One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.
  • Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.
  • “The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended.”
  • He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,” Eyal writes.
  • The most seductive design, Harris explains, exploits the same psychological susceptibility that makes gambling so compulsive: variable rewards. When we tap those apps with red icons, we don’t know whether we’ll discover an interesting email, an avalanche of “likes”, or nothing at all. It is the possibility of disappointment that makes it so compulsive.
  • Finally, Eyal confided the lengths he goes to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. “The idea is to remember that we are not powerless,” he said. “We are in control.”
  • But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?
  • Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. “All of us are jacked into this system,” he says. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
  • Harris, who has been branded “the closest thing Silicon Valley has to a conscience”, insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.
  • “I don’t know a more urgent problem than this,” Harris says. “It’s changing our democracy, and it’s changing our ability to have the conversations and relationships that we want with each other.” Harris went public – giving talks, writing papers, meeting lawmakers and campaigning for reform – after three years struggling to effect change inside Google’s Mountain View headquarters.
  • He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.
  • The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel “insecure”, “worthless” and “need a confidence boost”. Such granular information, Harris adds, is “a perfect model of what buttons you can push in a particular person”.
  • Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive “likes” for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. “There’s no ethics,” he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.
  • It was Rosenstein’s colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”, who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
  • Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture people’s attention, even stumbling across highly effective design by accident.
  • It’s this that explains how the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears, rapidly became one of the most addictive and ubiquitous design features in modern technology. “Each time you’re swiping down, it’s like a slot machine,” Harris says. “You don’t know what’s coming next. Sometimes it’s a beautiful photo. Sometimes it’s just an ad.”
  • The reality TV star’s campaign, Williams said, had heralded a watershed in which “the new, digitally supercharged dynamics of the attention economy have finally crossed a threshold and become manifest in the political realm”.
  • “Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”
  • All of it, he says, is reward-based behaviour that activates the brain’s dopamine pathways. He sometimes finds himself clicking on the red icons beside his apps “to make them go away”, but is conflicted about the ethics of exploiting people’s psychological vulnerabilities. “It is not inherently evil to bring people back to your product,” he says. “It’s capitalism.”
  • He identifies the advent of the smartphone as a turning point, raising the stakes in an arms race for people’s attention. “Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
  • McNamee chooses his words carefully. “The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences,” he says. “The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models.”
  • But how can Google and Facebook be forced to abandon the business models that have transformed them into two of the most profitable companies on the planet?
  • McNamee believes the companies he invested in should be subjected to greater regulation, including new anti-monopoly rules. In Washington, there is growing appetite, on both sides of the political divide, to rein in Silicon Valley. But McNamee worries the behemoths he helped build may already be too big to curtail.
  • Rosenstein, the Facebook “like” co-creator, believes there may be a case for state regulation of “psychologically manipulative advertising”, saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. “If we only care about profit maximisation,” he says, “we will go rapidly into dystopia.”
  • James Williams does not believe talk of dystopia is far-fetched. An ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.
  • It is a journey that has led him to question whether democracy can survive the new technological age.
  • He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”
  • That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.
  • Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not “on the front page of every newspaper every day.”
  • “Eighty-seven percent of people wake up and go to sleep with their smartphones,” he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.
  • “The attention economy incentivises the design of technologies that grab our attention,” he says. “In so doing, it privileges our impulses over our intentions.”
  • That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to “sensationalise, bait and entertain in order to survive”.
  • It is not just shady or bad actors who exploit the internet to change public opinion. The attention economy itself is set up to promote a phenomenon like Trump, who is masterly at grabbing and retaining the attention of supporters and critics alike, often by exploiting or creating outrage.
  • All of which has left Brichter, who has put his design work on the backburner while he focuses on building a house in New Jersey, questioning his legacy. “I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” he says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.
  • Williams stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.
  • All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. “We’ve habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium,” he says.
  • It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and “man’s almost infinite appetite for distractions”.
  • If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?
  • “The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.”
Javier E

The Problem With History Classes - The Atlantic

  • The passion and urgency with which these battles are fought reflect the misguided way history is taught in schools. Currently, most students learn history as a set narrative—a process that reinforces the mistaken idea that the past can be synthesized into a single, standardized chronicle of several hundred pages. This teaching pretends that there is a uniform collective story, which is akin to saying everyone remembers events the same.
  • Yet, history is anything but agreeable. It is not a collection of facts deemed to be "official" by scholars on high. It is a collection of historians exchanging different, often conflicting analyses.
  • rather than vainly seeking to transcend the inevitable clash of memories, American students would be better served by descending into the bog of conflict and learning the many "histories" that compose the American national story.
  • Perhaps Fisher offers the nation an opportunity to divorce, once and for all, memory from history. History may be an attempt to memorialize and preserve the past, but it is not memory; memories can serve as primary sources, but they do not stand alone as history. A history is essentially a collection of memories, analyzed and reduced into meaningful conclusions—but that collection depends on the memories chosen.
  • Memories make for a risky foundation: As events recede further into the past, the facts are distorted or augmented by entirely new details
  • people construct unique memories while informing perfectly valid histories. Just as there is a plurality of memories, so, too, is there a plurality of histories.
  • Scholars who read a diverse set of historians who are all focused on the same specific period or event are engaging in historiography.
  • This approach exposes textbooks as nothing more than a compilation of histories that the authors deemed to be most relevant and useful.
  • In historiography, the barrier between historian and student is dropped, exposing a conflict-ridden landscape. A diplomatic historian approaches an event from the perspective of the most influential statesmen (who are most often white males), analyzing the context, motives, and consequences of their decisions. A cultural historian peels back the objects, sights, and sounds of a period to uncover humanity’s underlying emotions and anxieties. A Marxist historian adopts the lens of class conflict to explain the progression of events. There are intellectual historians, social historians, and gender historians, among many others. Historians studying the same topic will draw different interpretations—sometimes radically so, depending on the sources they draw from
  • Jacoba Urist points out that history is "about explaining and interpreting past events analytically." If students are really to learn and master these analytical tools, then it is absolutely essential that they read a diverse set of historians and learn how brilliant men and women who are scrutinizing the same topic can reach different conclusions.
  • Rather than constructing a curriculum based on the muddled consensus of boards, legislatures, and think tanks, schools should teach students history through historiography. The shortcomings of one historian become apparent after reading the work of another one on the list.
  • Although, as Urist notes, the AP course is "designed to teach students to think like historians," my own experience in that class suggests that it fails to achieve that goal.
  • The course’s framework has always served as an outline of important concepts aiming to allow educators flexibility in how to teach; it makes no reference to historiographical conflicts. Historiography was an epiphany for me because I had never before come face-to-face with how historians think and reason.
  • When I took AP U.S. History, I jumbled these diverse histories into one indistinct narrative. Although the test involved open-ended essay questions, I was taught that graders were looking for a firm thesis—forcing students to adopt a side. The AP test also, unsurprisingly, rewards students who cite a wealth of supporting details.
  • By the time I took the test in 2009, I was a master at "checking boxes," weighing political factors equally against those involving socioeconomics and ensuring that previously neglected populations like women and ethnic minorities received their due. I did not know that I was pulling ideas from different historiographical traditions. I still subscribed to the idea of a prevailing national narrative and served as an unwitting sponsor of synthesis, oblivious to the academic battles that made such synthesis impossible.
  • Although there may be an inclination to seek to establish order where there is chaos, that urge must be resisted in teaching history. Public controversies over memory are hardly new. Students must be prepared to confront divisiveness, not conditioned to shoehorn agreement into situations where none is possible.
  • When conflict is accepted rather than resisted, it becomes possible for different conceptions of American history to co-exist. There is no longer a need to appoint a victor.
  • More importantly, the historiographical approach avoids pursuing truth for the sake of satisfying a national myth.
  • The country’s founding fathers crafted some of the finest expressions of personal liberty and representative government the world has ever seen; many of them also held fellow humans in bondage. This paradox is only a problem if the goal is to view the founding fathers as faultless, perfect individuals. If multiple histories are embraced, no one needs to fear that one history will be lost.
  • History is not indoctrination. It is a wrestling match. For too long, the emphasis has been on pinning the opponent. It is time to shift the focus to the struggle itself.
  • There is no better way to use the past to inform the present than by accepting the impossibility of a definitive history—and by ensuring that current students are equipped to grapple with the contested memories in their midst.
anonymous

The Aha! Moment: The Science Behind Creative Insight » Brain World

  • For most of us, the aha! moment usually occurs at the most inopportune times, never when we’re searching for it.
  • To Archimedes, it happened in the bathtub. Newton experienced it while wandering an apple orchard. Arthur Fry: church. Each encountered an epiphany, that powerful moment of spontaneous insight. Archimedes shouted Eureka! upon realizing how to calculate density and volume; to Newton came the law of universal gravity; to Arthur Fry, Post-it notes.
  • Behold the proverbial “aha!” moment — a key phenomenon that emerges in a range of situations, from offering a solution to a problem or a new interpretation of a situation to more simple feats such as understanding a joke or solving a crossword puzzle.
  • There are many different representations we use colloquially to describe good ideas — sparks, flashes, light-bulb moments; inspirations and innovations; muses and visions.
  • they usually materialize abruptly, without warning and seemingly out of thin air.
  • Laboratories and psychologists have attempted to study this phenomenon using behavioral methods for nearly a century, resulting merely in speculations as to where these ideas come from and how they form. Lately, though, with recent advancements and tools of cognitive neuroscience, researchers are able to explain the inner workings of the brain during moments of insight.
  • scientists have found that these sudden sparks are the result of a complex series of brain states.
  • Findings also suggest that we require more neural processes operating at different time scales in these moments than we use when solving a problem analytically or methodically.
  • Participants were presented with three words (e.g., crab, pine, sauce), and were instructed to think of a single word that forms a familiar two-word phrase with all three (e.g., apple can join with crab, pine, and sauce to form pineapple, crabapple, and applesauce). As soon as participants thought of a solution word, they pressed a button to indicate whether the answer had come to them suddenly (through insight), or if they used a methodical hypothesis testing approach — in other words, a trial-and-error approach. (A minimal code sketch of this check appears after this list.)
  • Gamma activity indicates a constellation of neurons binding together for the first time in the brain to create a new neural network pathway.
  • This is the creation of a new idea. Immediately following that gamma spike, the new idea pops into our consciousness, which we identify as the aha! moment.
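
A minimal Python sketch of the three-word check described above (the tiny lexicon and the function name are illustrative assumptions, not materials from the study):

    # Tiny stand-in lexicon; a real experiment would draw on a full dictionary.
    LEXICON = {"pineapple", "crabapple", "applesauce"}

    def solves(cues, candidate):
        # True if `candidate` forms a familiar compound with every cue,
        # joined in either order (crab+apple, pine+apple, apple+sauce).
        return all(cue + candidate in LEXICON or candidate + cue in LEXICON
                   for cue in cues)

    print(solves(["crab", "pine", "sauce"], "apple"))  # True
    print(solves(["crab", "pine", "sauce"], "tree"))   # False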
Javier E

Richard Blanco: Say This Isn't the End - The Atlantic

  • ... say we live on, say we’ll forget the masks that kept us from dying from the invisible, but say we won’t ever forget the invisible masks we realized we had been wearing most our lives, disguising ourselves from each other
  • Say we won’t veil ourselves again, that our souls will keep breathing timelessly, that we won’t return to clocking our lives with lists and appointments. Say we’ll keep our days errant as sun showers, impulsive as a star’s falling. Say this isn’t our end …
  • I say this might be the end we’ve always needed to begin again. I say this may be the end to let us hope to heal, to evolve, reach the stars.
  • Again I’ll say: heal, evolve, reach and become the stars that became us— whether or not this is or is not our end.