
Home/ TOK Friends/ Group items tagged social skills


Javier E

TikTok Brain Explained: Why Some Kids Seem Hooked on Social Video Feeds - WSJ - 0 views

  • Remember the good old days when kids just watched YouTube all day? Now that they binge on 15-second TikToks, those YouTube clips seem like PBS documentaries.
  • Many parents tell me their kids can’t sit through feature-length films anymore because to them the movies feel painfully slow. Others have observed their kids struggling to focus on homework. And reading a book? Forget about it.
  • What is happening to kids’ brains?
  • ...27 more annotations...
  • “It is hard to look at increasing trends in media consumption of all types, media multitasking and rates of ADHD in young people and not conclude that there is a decrease in their attention span.”
  • Emerging research suggests that watching short, fast-paced videos makes it harder for kids to sustain activities that don’t offer instant—and constant—gratification.
  • One of the few studies specifically examining TikTok-related effects on the brain focused on Douyin, the TikTok equivalent in China, made by the same Chinese parent company, ByteDance Ltd. It found that the personalized videos the app’s recommendation engine shows users activate the reward centers of the brain, as compared with the general-interest videos shown to new users.
  • Brain scans of Chinese college students showed that areas involved in addiction were highly activated in those who watched personalized videos.
  • It also found some people have trouble controlling when to stop watching.
  • “If kids’ brains become accustomed to constant changes, the brain finds it difficult to adapt to a nondigital activity where things don’t move quite as fast.”
  • A TikTok spokeswoman said the company wants younger teens to develop positive digital habits early on, and that it recently made some changes aimed at curbing extensive app usage. For example, TikTok won’t allow users ages 13 to 15 to receive push notifications after 9 p.m. TikTok also periodically reminds users to take a break to go outside or grab a snack.
  • Kids have a hard time pulling away from videos on YouTube, too, and Google has made several changes to help limit its use, including turning off autoplay by default on accounts of people under 18.
  • When kids do things that require prolonged focus, such as reading or solving math problems, they’re using directed attention
  • This function starts in the prefrontal cortex, the part of the brain responsible for decision making and impulse control.
  • “Directed attention is the ability to inhibit distractions and sustain attention and to shift attention appropriately. It requires higher-order skills like planning and prioritizing,”
  • Kids generally have a harder time doing this—and putting down their videogame controllers—because the prefrontal cortex isn’t fully developed until age 25.
  • “We speculate that individuals with lower self-control ability have more difficulty shifting attention away from favorite video stimulation.”
  • “In the short-form snackable world, you’re getting quick hit after quick hit, and as soon as it’s over, you have to make a choice,” said Mass General’s Dr. Marci, who wrote the new book “Rewired: Protecting Your Brain in the Digital Age.” The more developed the prefrontal cortex, the better the choices.
  • Dopamine is a neurotransmitter that gets released in the brain when it’s expecting a reward. A flood of dopamine reinforces cravings for something enjoyable, whether it’s a tasty meal, a drug or a funny TikTok video.
  • “TikTok is a dopamine machine,” said John Hutton, a pediatrician and director of the Reading & Literacy Discovery Center at Cincinnati Children’s Hospital. “If you want kids to pay attention, they need to practice paying attention.”
  • Researchers are just beginning to conduct long-term studies on digital media’s effects on kids’ brains. The National Institutes of Health is funding a study of nearly 12,000 adolescents as they grow into adulthood to examine the impact that many childhood experiences—from social media to smoking—have on cognitive development.
  • she predicts they will find that when brains repeatedly process rapid, rewarding content, their ability to process less-rapid, less-rewarding things “may change or be harmed.”
  • “It’s like we’ve made kids live in a candy store and then we tell them to ignore all that candy and eat a plate of vegetables,”
  • “We have an endless flow of immediate pleasures that’s unprecedented in human history.”
  • Parents and kids can take steps to boost attention, but it takes effort.
  • Swap screen time for real time. Exercise and free play are among the best ways to build attention during childhood.
  • “Depriving kids of tech doesn’t work, but simultaneously reducing it and building up other things, like playing outside, does,”
  • Practice restraint.
  • “When you practice stopping, it strengthens those connections in the brain to allow you to stop again next time.”
  • Use tech’s own tools. TikTok has a screen-time management setting that allows users to cap their app usage.
  • Ensure good sleep. Teens are suffering from a sleep deficit.
Javier E

How Tech Can Turn Doctors Into Clerical Workers - The New York Times - 0 views

  • what I see in my colleague is disillusionment, and it has come too early, and I am seeing too much of it.
  • In America today, the patient in the hospital bed is just the icon, a placeholder for the real patient who is not in the bed but in the computer. That virtual entity gets all our attention. Old-fashioned “bedside” rounds conducted by the attending physician too often take place nowhere near the bed but have become “card flip” rounds.
  • My young colleague slumping in the chair in my office survived the student years, then three years of internship and residency and is now a full-time practitioner and teacher. The despair I hear comes from being the highest-paid clerical worker in the hospital: For every one hour we spend cumulatively with patients, studies have shown, we spend nearly two hours on our primitive Electronic Health Records, or “E.H.R.s,” and another hour or two during sacred personal time.
  • ...23 more annotations...
  • The living, breathing source of the data and images we juggle, meanwhile, is in the bed and left wondering: Where is everyone? What are they doing? Hello! It’s my body, you know
  • Our $3.4 trillion health care system is responsible for more than a quarter of a million deaths per year because of medical error, the rough equivalent of, say, a jumbo jet’s crashing every day.
  • I can get cash and account details all over America and beyond. Yet I can’t reliably get a patient record from across town, let alone from a hospital in the same state, even if both places use the same brand of E.H.R.
  • the leading E.H.R.s were never built with any understanding of the rituals of care or the user experience of physicians or nurses. A clinician will make roughly 4,000 keyboard clicks during a busy 10-hour emergency-room shift.
  • In the process, our daily progress notes have become bloated cut-and-paste monsters that are inaccurate and hard to wade through. A half-page, handwritten progress note of the paper era might in a few lines tell you what a physician really thought.
  • so much of the E.H.R., but particularly the physical exam it encodes, is a marvel of fiction, because we humans don’t want to leave a check box empty or leave gaps in a template.
  • For a study, my colleagues and I at Stanford solicited anecdotes from physicians nationwide about patients for whom an oversight in the exam (a “miss”) had resulted in real consequences, like diagnostic delay, radiation exposure, therapeutic or surgical misadventure, even death. They were the sorts of things that would leave no trace in the E.H.R. because the recorded exam always seems complete — and yet the omission would be glaring and memorable to other physicians involved in the subsequent care. We got more than 200 such anecdotes.
  • The reason for these errors? Most of them resulted from exams that simply weren’t done as claimed. “Food poisoning” was diagnosed because the strangulated hernia in the groin was overlooked, or patients were sent to the catheterization lab for chest pain because no one saw the shingles rash on the left chest.
  • I worry that such mistakes come because we’ve gotten trapped in the bunker of machine medicine. It is a preventable kind of failure
  • How we salivated at the idea of searchable records, of being able to graph fever trends, or white blood counts, or share records at a keystroke with another institution — “interoperability”
  • The seriously ill patient has entered another kingdom, an alternate universe, a place and a process that is frightening, infantilizing; that patient’s greatest need is both scientific state-of-the-art knowledge and genuine caring from another human being. Caring is expressed in listening, in the time-honored ritual of the skilled bedside exam — reading the body — in touching and looking at where it hurts and ultimately in localizing the disease for patients not on a screen, not on an image, not on a biopsy report, but on their bodies.
  • What if the computer gave the nurse the big picture of who he was both medically and as a person?
  • a professor at M.I.T. whose current interest in biomedical engineering is “bedside informatics,” marvels at the fact that in an I.C.U., a blizzard of monitors from disparate manufacturers display EKG, heart rate, respiratory rate, oxygen saturation, blood pressure, temperature and more, and yet none of this is pulled together, summarized and synthesized anywhere for the clinical staff to use
  • What these monitors do exceedingly well is sound alarms, an average of one alarm every eight minutes, or more than 180 per patient per day. What is our most common response to an alarm? We look for the button to silence the nuisance because, unlike those in a Boeing cockpit, say, our alarms are rarely diagnosing genuine danger.
  • By some estimates, more than 50 percent of physicians in the United States have at least one symptom of burnout, defined as a syndrome of emotional exhaustion, cynicism and decreased efficacy at work
  • It is on the increase, up by 9 percent from 2011 to 2014 in one national study. This is clearly not an individual problem but a systemic one, a 4,000-key-clicks-a-day problem.
  • The E.H.R. is only part of the issue: Other factors include rapid patient turnover, decreased autonomy, merging hospital systems, an aging population, the increasing medical complexity of patients. Even if the E.H.R. is not the sole cause of what ails us, believe me, it has become the symbol of burnout.
  • burnout is one of the largest predictors of physician attrition from the work force. The total cost of recruiting a physician can be nearly $90,000, but the lost revenue per physician who leaves is between $500,000 and $1 million, even more in high-paying specialties.
  • I hold out hope that artificial intelligence and machine-learning algorithms will transform our experience, particularly if natural-language processing and video technology allow us to capture what is actually said and done in the exam room.
  • as with any lab test, what A.I. will provide is at best a recommendation that a physician using clinical judgment must decide how to apply.
  • True clinical judgment is more than addressing the avalanche of blood work, imaging and lab tests; it is about using human skills to understand where the patient is in the trajectory of a life and the disease, what the nature of the patient’s family and social circumstances is and how much they want done.
  • Much of that is a result of poorly coordinated care, poor communication, patients falling through the cracks, knowledge not being transferred and so on, but some part of it is surely from failing to listen to the story and diminishing skill in reading the body as a text.
  • As he was nearing death, Avedis Donabedian, a guru of health care metrics, was asked by an interviewer about the commercialization of health care. “The secret of quality,” he replied, “is love.”
Javier E

What's behind the confidence of the incompetent? This suddenly popular psychological ph... - 1 views

  • To test Darwin’s theory, the researchers quizzed people on several topics, such as grammar, logical reasoning and humor. After each test, they asked the participants how they thought they did. Specifically, participants were asked how many of the other quiz-takers they beat.
  • Dunning was shocked by the results, even though they confirmed his hypothesis. Time after time, no matter the subject, the people who did poorly on the tests ranked their competence much higher. On average, test takers who scored as low as the 10th percentile ranked themselves near the 70th percentile. Those least likely to know what they were talking about believed they knew as much as the experts.
  • Dunning and Kruger’s results have been replicated in at least a dozen different domains: math skills, wine tasting, chess, medical knowledge among surgeons and firearm safety among hunters.
  • ...8 more annotations...
  • Dunning-Kruger “offers an explanation for a kind of hubris,” said Steven Sloman, a cognitive psychologist at Brown University. “The fact is, that’s Trump in a nutshell. He’s a man with zero political skill who has no idea he has zero political skill. And it’s given him extreme confidence.”
  • What happens when the incompetent are unwilling to admit they have shortcomings? Are they so confident in their own perceived knowledge that they will reject the very idea of improvement? Not surprisingly (though no less concerning), Dunning’s follow-up research shows the poorest performers are also the least likely to accept criticism or show interest in self-improvement.
  • Someone who has very little knowledge in a subject claims to know a lot.
  • the Dunning-Kruger effect. It’s not a disease, syndrome or mental illness; it is present in everybody to some extent, and it’s been around as long as human cognition, though only recently has it been studied and documented in social psychology
  • “Obviously it has to do with Trump and the various treatments that people have given him,” Dunning said, “So yeah, a lot of it is political. People trying to understand the other side. We have a massive rise in partisanship and it’s become more vicious and extreme, so people are reaching for explanations."
  • Even though President Trump’s statements are rife with errors, falsehoods or inaccuracies, he expresses great confidence in his aptitude. He says he does not read extensively because he solves problems “with very little knowledge other than the knowledge I [already] had.” He has said in interviews he doesn’t read lengthy reports because “I already know exactly what it is.”
  • the Dunning-Kruger effect has become popular outside of the research world because it is a simple phenomenon that could apply to all of us
  • The ramifications of the Dunning-Kruger effect are usually harmless. If you’ve ever felt confident answering questions on an exam, only to have the teacher mark them incorrect, you have firsthand experience with Dunning-Kruger.
Javier E

Why Elders Smile - NYTimes.com - 1 views

  • When researchers ask people to assess their own well-being, people in their 20s rate themselves highly. Then there’s a decline as people get sadder in middle age, bottoming out around age 50. But then happiness levels shoot up, so that old people are happier than young people. The people who rate themselves most highly are those ages 82 to 85.
  • Older people are more relaxed, on average. They are spared some of the burden of thinking about the future. As a result, they get more pleasure out of present, ordinary activities.
  • I’d rather think that elder happiness is an accomplishment, not a condition, that people get better at living through effort, by mastering specific skills. I’d like to think that people get steadily better at handling life’s challenges. In middle age, they are confronted by stressful challenges they can’t control, like having teenage children. But, in old age, they have more control over the challenges they will tackle and they get even better at addressing them.
  • ...10 more annotations...
  • Aristotle teaches us that being a good person is not mainly about learning moral rules and following them. It is about performing social roles well — being a good parent or teacher or lawyer or friend.
  • First, there’s bifocalism, the ability to see the same situation from multiple perspectives.
  • “Anyone who has worn bifocal lenses knows that it takes time to learn to shift smoothly between perspectives and to combine them in a single field of vision. The same is true of deliberation. It is difficult to be compassionate, and often just as difficult to be detached, but what is most difficult of all is to be both at once.”
  • Only with experience can a person learn to see a fraught situation both close up, with emotional intensity, and far away, with detached perspective.
  • Then there’s lightness, the ability to be at ease with the downsides of life.
  • while older people lose memory, they also learn that most setbacks are not the end of the world. Anxiety is the biggest waste in life. If you know that you’ll recover, you can save time and get on with it sooner.
  • Then there is the ability to balance tensions. In “Practical Wisdom,” Barry Schwartz and Kenneth Sharpe argue that performing many social roles means balancing competing demands. A doctor has to be honest but also kind. A teacher has to instruct but also inspire.
  • You can’t find the right balance in each context by memorizing a rule book. This form of wisdom can only be earned by acquiring a repertoire of similar experiences.
  • Finally, experienced heads have intuitive awareness of the landscape of reality, a feel for what other people are thinking and feeling, an instinct for how events will flow.
  • a lifetime of intellectual effort can lead to empathy and pattern awareness. “What I have lost with age in my capacity for hard mental work,” Goldberg writes, “I seem to have gained in my capacity for instantaneous, almost unfairly easy insight.”
tongoscar

Three quarters of refugees feel welcome in Germany | News | DW | 18.02.2020 - 0 views

shared by tongoscar on 19 Feb 20
  • Around three-quarters of refugees in Germany feel welcome in the country, according to a recent survey conducted by Germany's Federal Office for Migration and Refugees (BAMF). The survey, which was published on Tuesday, claimed that key factors determining overall life satisfaction for refugees in Germany depended on their family situation and health, residence status and housing, employment status, and the extent of social interaction with local Germans.
  • "All in all, the refugees assessed their living conditions in Germany rather positively," project manager Nina Rother said. "The feeling of being welcomed in Germany also plays an important role."
  • The joint study was conducted in collaboration with Germany's Institute for Employment Research (IAB). The report interviewed a total of 7,950 refugees who came to Germany between 2013 and 2016. According to the researchers, a good level of German is a central prerequisite for professional and social integration.
  • ...1 more annotation...
  • The survey stated that 44% of the refugees now rate their German language skills as "good" or "very good." In 2017, the figure stood at 35% and in the first survey in 2016, at only 22%. The proportion of respondents without any knowledge of German has fallen to 5%.
Javier E

Adam Kirsch: Art Over Biology | The New Republic - 1 views

  • Nietzsche, who wrote in Human, All Too Human, under the rubric “Art dangerous for the artist,” about the particular ill-suitedness of the artist to flourishing in a modern scientific age: When art seizes an individual powerfully, it draws him back to the views of those times when art flowered most vigorously.... The artist comes more and more to revere sudden excitements, believes in gods and demons, imbues nature with a soul, hates science, becomes unchangeable in his moods like the men of antiquity, and desires the overthrow of all conditions that are not favorable to art.... Thus between him and the other men of his period who are the same age a vehement antagonism is finally generated, and a sad end.
  • What is modern is the sense of the superiority of the artist’s inferiority, which is only possible when the artist and the intellectual come to see the values of ordinary life—prosperity, family, worldly success, and happiness—as inherently contemptible.
  • Art, according to a modern understanding that has not wholly vanished today, is meant to be a criticism of life, especially of life in a materialist, positivist civilization such as our own. If this means the artist does not share in civilization’s boons, then his suffering will be a badge of honor.
  • ...18 more annotations...
  • The iron law of Darwinian evolution is that everything that exists strives with all its power to reproduce, to extend life into the future, and that every feature of every creature can be explained as an adaptation toward this end. For the artist to deny any connection with the enterprise of life, then, is to assert his freedom from this universal imperative; to reclaim negatively the autonomy that evolution seems to deny to human beings. It is only because we can freely choose our own ends that we can decide not to live for life, but for some other value that we posit. The artist’s decision to produce spiritual offspring rather than physical ones is thus allied to the monk’s celibacy and the warrior’s death for his country, as gestures that deny the empire of mere life.
  • Animals produce beauty on their bodies; humans can also produce it in their artifacts. The natural inference, then, would be that art is a human form of sexual display, a way for males to impress females with spectacularly redundant creations.
  • For Darwin, the human sense of beauty was not different in kind from the bird’s.
  • Still, Darwin recognized that the human sense of beauty was mediated by “complex ideas and trains of thought,” which make it impossible to explain in terms as straightforward as a bird’s:
  • Put more positively, one might say that any given work of art can be discussed critically and historically, but not deduced from the laws of evolution.
  • with the rise of evolutionary psychology, it was only a matter of time before the attempt was made to explain art in Darwinian terms. After all, if ethics and politics can be explained by game theory and reciprocal altruism, there is no reason why aesthetics should be different: in each case, what appears to be a realm of human autonomy can be reduced to the covert expression of biological imperatives
  • Still, there is an unmistakable sense in discussions of Darwinian aesthetics that by linking art to fitness, we can secure it against charges of irrelevance or frivolousness—that mattering to reproduction is what makes art, or anything, really matter.
  • The first popular effort in this direction was the late Denis Dutton’s much-discussed book The Art Instinct, which appeared in 2009.
  • Dutton’s Darwinism was aesthetically conservative: “Darwinian aesthetics,” he wrote, “can restore the vital place of beauty, skill, and pleasure as high artistic values.” Dutton’s argument has recently been reiterated and refined by a number of new books,
  • “The universality of art and artistic behaviors, their spontaneous appearance everywhere across the globe ... and the fact that in most cases they can be easily recognized as artistic across cultures suggest that they derive from a natural, innate source: a universal human psychology.”
  • Again like language, art is universal in the sense that any local expression of it can be “learned” by anyone.
  • Yet earlier theorists of evolution were reluctant to say that art was an evolutionary adaptation like language, for the simple reason that it does not appear to be evolutionarily adaptive.
  • Stephen Jay Gould suggested that art was not an evolutionary adaptation but what he called a “spandrel”—that is, a showy but accidental by-product of other adaptations that were truly functional.
  • the very words “success” and “failure,” despite themselves, bring an emotive and ethical dimension into the discussion, so impossible is it for human beings to inhabit a valueless world. In the nineteenth century, the idea that fitness for survival was a positive good motivated social Darwinism and eugenics. Proponents of these ideas thought that in some way they were serving progress by promoting the flourishing of the human race, when the basic premise of Darwinism is that there is no such thing as progress or regress, only differential rates of reproduction
  • In particular, Darwin suggests that it is impossible to explain the history or the conventions of any art by the general imperatives of evolution
  • Boyd begins with the premise that human beings are pattern-seeking animals: both our physical perceptions and our social interactions are determined by our brain’s innate need to find and to make coherent patterns.
  • Art, then, can be defined as the calisthenics of pattern-finding. “Just as animal physical play refines performance, flexibility, and efficiency in key behaviors,” Boyd writes, “so human art refines our performance in our key perceptual and cognitive modes, in sight (the visual arts), sound (music), and social cognition (story). These three modes of art, I propose, are adaptations ... they show evidence of special design in humans, design that offers survival and especially reproductive advantages.”
Javier E

Why It's OK to Let Apps Make You a Better Person - Evan Selinger - Technology - The Atl... - 0 views

  • one theme emerges from the media coverage of people's relationships with our current set of technologies: Consumers want digital willpower. App designers in touch with the latest trends in behavioral modification--nudging, the quantified self, and gamification--and good old-fashioned financial incentive manipulation, are tackling weakness of will. They're harnessing the power of payouts, cognitive biases, social networking, and biofeedback. The quantified self becomes the programmable self.
  • the trend still has multiple interesting dimensions
  • Individuals are turning ever more aspects of their lives into managerial problems that require technological solutions. We have access to an ever-increasing array of free and inexpensive technologies that harness incredible computational power that effectively allows us to self-police behavior everywhere we go. As pervasiveness expands, so does trust.
  • ...20 more annotations...
  • Some embrace networked, data-driven lives and are comfortable volunteering embarrassing, real time information about what we're doing, whom we're doing it with, and how we feel about our monitored activities.
  • Put it all together and we can see that our conception of what it means to be human has become "design space." We're now Humanity 2.0, primed for optimization through commercial upgrades. And today's apps are more harbinger than endpoint.
  • philosophers have had much to say about the enticing and seemingly inevitable dispersion of technological mental prosthetics that promise to substitute for or enhance some of our motivational powers.
  • beyond the practical issues lie a constellation of central ethical concerns.
  • they should cause us to pause as we think about a possible future that significantly increases the scale and effectiveness of willpower-enhancing apps. Let's call this hypothetical future Digital Willpower World and characterize the ethical traps we're about to discuss as potential general pitfalls
  • it is antithetical to the ideal of " resolute choice." Some may find the norm overly perfectionist, Spartan, or puritanical. However, it is not uncommon for folks to defend the idea that mature adults should strive to develop internal willpower strong enough to avoid external temptations, whatever they are, and wherever they are encountered.
  • In part, resolute choosing is prized out of concern for consistency, as some worry that lapse of willpower in any context indicates a generally weak character.
  • Fragmented selves behave one way while under the influence of digital willpower, but another when making decisions without such assistance. In these instances, inconsistent preferences are exhibited and we risk underestimating the extent of our technological dependency.
  • It simply means that when it comes to digital willpower, we should be on our guard to avoid confusing situational with integrated behaviors.
  • the problem of inauthenticity, a staple of the neuroethics debates, might arise. People might start asking themselves: Has the problem of fragmentation gone away only because devices are choreographing our behavior so powerfully that we are no longer in touch with our so-called real selves -- the selves who used to exist before Digital Willpower World was formed?
  • Infantilized subjects are morally lazy, quick to have others take responsibility for their welfare. They do not view the capacity to assume personal responsibility for selecting means and ends as a fundamental life goal that validates the effort required to remain committed to the ongoing project of maintaining willpower and self-control.
  • Michael Sandel's Atlantic essay, "The Case Against Perfection." He notes that technological enhancement can diminish people's sense of achievement when their accomplishments become attributable to human-technology systems and not an individual's use of human agency.
  • Borgmann worries that this environment, which habituates us to be on auto-pilot and delegate deliberation, threatens to harm the powers of reason, the most central component of willpower (according to the rationalist tradition).
  • In several books, including Technology and the Character of Contemporary Life, he expresses concern about technologies that seem to enhance willpower but only do so through distraction. Borgmann's paradigmatic example of the non-distracted, focally centered person is a serious runner. This person finds the practice of running maximally fulfilling, replete with the rewarding "flow" that only comes when mind/body and means/ends are unified, while skill gets pushed to the limit.
  • Perhaps the very conception of a resolute self was flawed. What if, as psychologist Roy Baumeister suggests, willpower is more a "staple of folk psychology" than a real way of thinking about our brain processes?
  • novel approaches suggest the will is a flexible mesh of different capacities and cognitive mechanisms that can expand and contract, depending on the agent's particular setting and needs. Contrary to the traditional view that identifies the unified and cognitively transparent self as the source of willed actions, the new picture embraces a rather diffused, extended, and opaque self who is often guided by irrational trains of thought. What actually keeps the self and its will together are the given boundaries offered by biology, a coherent self narrative created by shared memories and experiences, and society. If this view of the will as an expanding and contracting system with porous and dynamic boundaries is correct, then it might seem that the new motivating technologies and devices can only increase our reach and further empower our willing selves.
  • "It's a mistake to think of the will as some interior faculty that belongs to an individual--the thing that pushes the motor control processes that cause my action," Gallagher says. "Rather, the will is both embodied and embedded: social and physical environment enhance or impoverish our ability to decide and carry out our intentions; often our intentions themselves are shaped by social and physical aspects of the environment."
  • It makes perfect sense to think of the will as something that can be supported or assisted by technology. Technologies, like environments and institutions, can facilitate action or block it. Imagine I have the inclination to go to a concert. If I can get my ticket by pressing some buttons on my iPhone, I find myself going to the concert. If I have to fill out an application form and carry it to a location several miles away and wait in line to pick up my ticket, then forget it.
  • Perhaps the best way forward is to put a digital spin on the Socratic dictum of knowing myself and submit to the new freedom: the freedom of consuming digital willpower to guide me past the sirens.
Javier E

Elon studies future of "Generation Always-On" - 1 views

  • Elon studies the future of "Generation Always-On"
  • By the year 2020, it is expected that youth of the “always-on generation,” brought up from childhood with a continuous connection to each other and to information, will be nimble, quick-acting multitaskers who count on the Internet as their external brain and who approach problems in a different way from their elders. "There is no doubt that brains are being rewired,"
  • the Internet Center, refers to the teens-to-20s age group born since the turn of the century as Generation AO, for “always-on." “They have grown up in a world that has come to offer them instant access to nearly the entirety of human knowledge, and incredible opportunities to connect, create and collaborate,"
  • ...10 more annotations...
  • some said they are already witnessing deficiencies in young people's abilities to focus their attention, be patient and think deeply. Some experts expressed concerns that trends are leading to a future in which most people become shallow consumers of information, endangering society.
  • Many of the respondents in this survey predict that Gen AO will exhibit a thirst for instant gratification and quick fixes and a lack of patience and deep-thinking ability due to what one referred to as “fast-twitch wiring.”
  • “The replacement of memorization by analysis will be the biggest boon to society since the coming of mass literacy in the late 19th to early 20th century.” — Paul Jones, University of North Carolina-Chapel Hill
  • “Teens find distraction while working, distraction while driving, distraction while talking to the neighbours. Parents and teachers will have to invest major time and efforts into solving this issue – silence zones, time-out zones, meditation classes without mobile, lessons in ignoring people.”
  • “Society is becoming conditioned into dependence on technology in ways that, if that technology suddenly disappears or breaks down, will render people functionally useless. What does that mean for individual and social resiliency?”
  • “Short attention spans resulting from quick interactions will be detrimental to focusing on the harder problems and we will probably see a stagnation in many areas: technology, even social venues such as literature. The people who will strive and lead the charge will be the ones able to disconnect themselves to focus.”
  • “The underlying issue is that they will become dependent on the Internet in order to solve problems and conduct their personal, professional, and civic lives. Thus centralized powers that can control access to the Internet will be able to significantly control future generations. It will be much as in Orwell's 1984, where control was achieved by using language to shape and limit thought, so future regimes may use control of access to the Internet to shape and limit thought.”
  • “Increasingly, teens and young adults rely on the first bit of information they find on a topic, assuming that they have found the ‘right’ answer, rather than using context and vetting/questioning the sources of information to gain a holistic view of a topic.”
  • “Parents and kids will spend less time developing meaningful and bonded relationships in deference to the pursuit and processing of more and more segmented information competing for space in their heads, slowly changing their connection to humanity.”
  • “It’s simply not possible to discuss, let alone form societal consensus around major problems without lengthy, messy conversations about those problems. A generation that expects to spend 140 or fewer characters on a topic and rejects nuance is incapable of tackling these problems.”
Javier E

How Memory Works: Interview with Psychologist Daniel L. Schacter | History News Network - 2 views

  • knowledge of how human memory works, viewed from a scientific perspective, can be instructive to historians.
  • Memory is much more than a simple retrieval system, as Dr. Schacter has demonstrated in his research. Rather, the nature of memory is constructive and influenced by a person’s current state as well as intervening emotions, beliefs, events and other factors since a recalled event.
  • Dr. Schacter is William R. Kenan, Jr. Professor of Psychology at Harvard University. His books include Searching for Memory: The Brain, The Mind, and The Past, and The Seven Sins of Memory: How the Mind Forgets and Remembers, both winners of the American Psychological Association’s William James Book Award, and Forgotten Ideas, Neglected Pioneers: Richard Semon and the Story of Memory. He also has written hundreds of articles on memory and related matters. He was elected a Fellow of the American Academy of Arts and Sciences in 1996 and the National Academy of Sciences in 2013.
  • ...16 more annotations...
  • that memory is not a video recorder [but that] it’s a constructive activity that is in many ways accurate but prone to interesting errors and distortions. It’s the constructive side of memory that is most relevant to historians.
  • Is it the case then that our memories constantly change every time we access them?
  • That certainly can happen depending on how you recount a memory. What you emphasize. What you exaggerate. What you don’t talk about. All of those things will shape and sculpt the memory for future use. Certainly the potential is there.
  • Research on memory shows that the more distant in time the event, the more prone to inaccuracy the memory. There are several experiments in which subjects recorded impressions of an event soon afterward, then a year later and then a few years later, and the memory changed. Yes. It’s not that the information is lost but, as the memory weakens, you become more prone to incorporating other kinds of information or mixing up elements of other events. This has been seen, for example, in the study of flashbulb memories. Where were you when Kennedy was shot? Where were you when you heard about 9/11?
  • Isn’t there a tendency to add details or information that may make the story more convincing or interesting later? Yes. That’s more a social function of memory. It may be that you draw on your general knowledge and probable information from your memory in a social context where there may be social demands that lead you to distort the memory.
  • What are the different memory systems?
  • What is the difference between working memory and permanent memory? Working memory is really a temporary memory buffer where you hold onto information, manipulate information, use it, and it’s partly a gateway to long-term memory and also a buffer that you use when you’re retrieving information from long-term memory and that information temporarily resides in working memory, so to speak.
  • Your discussion of the testimony of White House Counsel John Dean about Watergate is illuminating. There was a perception that Dean had a photographic memory and he testified in rich detail about events. Yet later studies of White House tape recordings revealed that he was often inaccurate.
  • Because of all the detail with which he reported events, and the great confidence with which he reported them, he was perceived as something analogous to a human tape recorder. Yet there was interesting work done by psychologist Ulric Neisser, who went back and analyzed what Dean said at the hearings as compared with available information on the White House taping system, and found many significant discrepancies between what Dean remembered and what was actually said. He usually had the gist and the meaning and overall significance right, but the exact details were often quite different in his memory from what actually was said.
  • That seems to get into the area of false memories and how they present problems in the legal system. We know from DNA exonerations of people wrongfully convicted of crimes that a large majority of those cases rested on faulty eyewitness memory -- one of the more recent estimates is that in the first 250 DNA exoneration cases, as of 2011, roughly 70 to 75 percent of those individuals were convicted on the basis of faulty eyewitness memory.
  • One of the interesting recent lines of research that my lab has been involved in over the past few years has been looking at similarities between what goes on between the brain and mind when we remember past events on the one hand and imagine events that might occur in the future or might have occurred in the past. What we have found, particularly with brain scanning studies, is that you get very similar brain networks coming online when you remember past events and imagine future events, for example. Many of the same brain regions or network of structures come online, and this has helped us understand more why, for example, imagining events that might have occurred can be so harmful to memory accuracy because when you imagine, you’re recruiting many of the same brain regions as accessed when you actually remember. So it’s not surprising that some of these imagined events can actually turn into false memories under the right circumstances.
  • One reasonably well accepted distinction involves episodic memory, the memory for personal experience; semantic memory, the memory for general knowledge; and procedural memory, the memory for skills and unconscious forms of memory. Those are three of the major kinds of memory and they all have different neural substrates.
  • One of the points from that Ross Perot study is that his supporters often misremembered what they felt like at the time he reported he had dropped out of the race. The nature of that misremembering depended on their state at the time they were remembering, and what decisions they had made about Perot in the interim affected how they reconstructed their earlier memories. Again, that nicely makes the point that our current emotions and current appraisals of a situation can feed back into our reconstruction of the past and sometimes lead us to distort our memories so that they better support our current emotions and our current selves. We’re often using memories to justify what we currently know, believe and feel.
  • memory doesn’t work like a video camera or tape recorder. That is the main point. Our latest thinking on this is the idea that one of the major functions of memory is to support our ability to plan for the future, to imagine the future, and to use our past experiences in a flexible way to simulate different outcomes of events.
  • flexibility of memory is something that makes it useful to support this very important ability to run simulations of future events. But that very flexibility might be something that contributes to some of the memory distortion we talked about. That has been prominent in the last few years in my thinking about the constructive nature of memory.
  • The historian Daniel Aaron told his students “we remember what’s important.” What do you think of that comment? I think that generally holds true. Certainly, again, more important memories tend to be more significant, with more emotional arousal, and may elicit “deeper processing,” as we call it in cognitive psychology
Javier E

Stop Googling. Let's Talk. - The New York Times - 3 views

  • In a 2015 study by the Pew Research Center, 89 percent of cellphone owners said they had used their phones during the last social gathering they attended. But they weren’t happy about it; 82 percent of adults felt that the way they used their phones in social settings hurt the conversation.
  • I’ve been studying the psychology of online connectivity for more than 30 years. For the past five, I’ve had a special focus: What has happened to face-to-face conversation in a world where so many people say they would rather text than talk?
  • Young people spoke to me enthusiastically about the good things that flow from a life lived by the rule of three, which you can follow not only during meals but all the time. First of all, there is the magic of the always available elsewhere. You can put your attention wherever you want it to be. You can always be heard. You never have to be bored.
  • ...23 more annotations...
  • But the students also described a sense of loss.
  • A 15-year-old boy told me that someday he wanted to raise a family, not the way his parents are raising him (with phones out during meals and in the park and during his school sports events) but the way his parents think they are raising him — with no phones at meals and plentiful family conversation. One college junior tried to capture what is wrong about life in his generation. “Our texts are fine,” he said. “It’s what texting does to our conversations when we are together that’s the problem.”
  • One teacher observed that the students “sit in the dining hall and look at their phones. When they share things together, what they are sharing is what is on their phones.” Is this the new conversation? If so, it is not doing the work of the old conversation. The old conversation taught empathy. These students seem to understand each other less.
  • In 2010, a team at the University of Michigan led by the psychologist Sara Konrath put together the findings of 72 studies that were conducted over a 30-year period. They found a 40 percent decline in empathy among college students, with most of the decline taking place after 2000.
  • We’ve gotten used to being connected all the time, but we have found ways around conversation — at least from conversation that is open-ended and spontaneous, in which we play with ideas and allow ourselves to be fully present and vulnerable. But it is in this type of conversation — where we learn to make eye contact, to become aware of another person’s posture and tone, to comfort one another and respectfully challenge one another — that empathy and intimacy flourish. In these conversations, we learn who we are.
  • the trend line is clear. It’s not only that we turn away from talking face to face to chat online. It’s that we don’t allow these conversations to happen in the first place because we keep our phones in the landscape.
  • It’s a powerful insight. Studies of conversation both in the laboratory and in natural settings show that when two people are talking, the mere presence of a phone on a table between them or in the periphery of their vision changes both what they talk about and the degree of connection they feel. People keep the conversation on topics where they won’t mind being interrupted. They don’t feel as invested in each other. Even a silent phone disconnects us.
  • Yalda T. Uhls was the lead author on a 2014 study of children at a device-free outdoor camp. After five days without phones or tablets, these campers were able to read facial emotions and correctly identify the emotions of actors in videotaped scenes significantly better than a control group. What fostered these new empathic responses? They talked to one another. In conversation, things go best if you pay close attention and learn how to put yourself in someone else’s shoes. This is easier to do without your phone in hand. Conversation is the most human and humanizing thing that we do.
  • At a nightly cabin chat, a group of 14-year-old boys spoke about a recent three-day wilderness hike. Not that many years ago, the most exciting aspect of that hike might have been the idea of roughing it or the beauty of unspoiled nature. These days, what made the biggest impression was being phoneless. One boy called it “time where you have nothing to do but think quietly and talk to your friends.” The campers also spoke about their new taste for life away from the online feed. Their embrace of the virtue of disconnection suggests a crucial connection: The capacity for empathic conversation goes hand in hand with the capacity for solitude.
  • In solitude we find ourselves; we prepare ourselves to come to conversation with something to say that is authentic, ours. If we can’t gather ourselves, we can’t recognize other people for who they are. If we are not content to be alone, we turn others into the people we need them to be. If we don’t know how to be alone, we’ll only know how to be lonely.
  • we have put this virtuous circle in peril. We turn time alone into a problem that needs to be solved with technology.
  • People sometimes say to me that they can see how one might be disturbed when people turn to their phones when they are together. But surely there is no harm when people turn to their phones when they are by themselves? If anything, it’s our new form of being together.
  • But this way of dividing things up misses the essential connection between solitude and conversation. In solitude we learn to concentrate and imagine, to listen to ourselves. We need these skills to be fully present in conversation.
  • One start toward reclaiming conversation is to reclaim solitude. Some of the most crucial conversations you will ever have will be with yourself. Slow down sufficiently to make this possible. And make a practice of doing one thing at a time. Think of unitasking as the next big thing. In every domain of life, it will increase performance and decrease stress.
  • Multitasking comes with its own high, but when we chase after this feeling, we pursue an illusion. Conversation is a human way to practice unitasking.
  • Our phones are not accessories, but psychologically potent devices that change not just what we do but who we are. A second path toward conversation involves recognizing the degree to which we are vulnerable to all that connection offers. We have to commit ourselves to designing our products and our lives to take that vulnerability into account.
  • We can choose not to carry our phones all the time. We can park our phones in a room and go to them every hour or two while we work on other things or talk to other people. We can carve out spaces at home or work that are device-free, sacred spaces for the paired virtues of conversation and solitude.
  • Families can find these spaces in the day to day — no devices at dinner, in the kitchen and in the car.
  • Engineers are ready with more ideas: What if our phones were not designed to keep us attached, but to do a task and then release us? What if the communications industry began to measure the success of devices not by how much time consumers spend on them but by whether it is time well spent?
  • The young woman who is so clear about the seven minutes that it takes to see where a conversation is going admits that she often doesn’t have the patience to wait for anything near that kind of time before going to her phone. In this she is characteristic of what the psychologists Howard Gardner and Katie Davis called the “app generation,” which grew up with phones in hand and apps at the ready. It tends toward impatience, expecting the world to respond like an app, quickly and efficiently. The app way of thinking starts with the idea that actions in the world will work like algorithms: Certain actions will lead to predictable results.
  • This attitude can show up in friendship as a lack of empathy. Friendships become things to manage; you have a lot of them, and you come to them with tools
  • here is a first step: To reclaim conversation for yourself, your friendships and society, push back against viewing the world as one giant app. It works the other way, too: Conversation is the antidote to the algorithmic way of looking at life because it teaches you about fluidity, contingency and personality.
  • We have time to make corrections and remember who we are — creatures of history, of deep psychology, of complex relationships, of conversations, artless, risky and face to face.
Javier E

Anxious Students Strain College Mental Health Centers - NYTimes.com - 0 views

  • Anxiety has now surpassed depression as the most common mental health diagnosis among college students, though depression, too, is on the rise. More than half of students visiting campus clinics cite anxiety as a health concern,
  • Nearly one in six college students has been diagnosed with or treated for anxiety within the last 12 months
  • The causes range widely, experts say, from mounting academic pressure at earlier ages to overprotective parents to compulsive engagement with social media.
  • ...10 more annotations...
  • the consensus among therapists is that treating anxiety has become an enormous challenge for campus mental health centers.
  • More students are seeking help partly because the stigma around mental health issues is lessening
  • Because of escalating pressures during high school, he and other experts say, students arrive at college preloaded with stress. Accustomed to extreme parental oversight, many seem unable to steer themselves. And with parents so accessible, students have had less incentive to develop life skills.
  • Social media is a gnawing, roiling constant. As students see posts about everyone else’s fabulous experiences, the inevitable comparisons erode their self-esteem. The popular term is “FOMO” — fear of missing out.
  • Anxiety is an umbrella term for several disorders, including social anxiety disorder and agoraphobia. It can accompany many other diagnoses, such as depression, and it can be persistent and incapacitating.
  • Students who suffer from this acute manifestation can feel their very real struggles are shrugged off, because anxiety has become so ubiquitous, almost a cliché, on campus.
  • More often, anxiety is mild, intermittent or temporary, the manifestation of a student in the grip of a normal developmental issue — learning time management, for example, or how to handle rejection from a sorority.
  • Mild anxiety is often treatable with early, modest interventions. But to care for rising numbers of severely troubled students, many counseling centers have moved to triage protocols.
  • Researchers at Penn State, who have tracked campus counseling centers nationwide for six years, have documented a trend that other studies have noted: Students are arriving with ever more severe mental-health issues
  • Half of clients at mental health centers in their most recent report had already had some form of counseling before college. One-third have taken psychiatric medication. One quarter have self-injured.
Javier E

Ivy League Schools Are Overrated. Send Your Kids Elsewhere. | New Republic - 1 views

  • a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough.
  • With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.”
  • On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.
  • ...52 more annotations...
  • When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine.
  • Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.
  • “Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance
  • Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table.
  • It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.
  • I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.
  • Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.
  • So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk.
  • There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”
  • What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?
  • The first thing that college is for is to teach you to think.
  • College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.
  • it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.
  • College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.
  • Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions.
  • Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.
  • Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect.
  • At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much
  • professors and students have largely entered into what one observer called a “nonaggression pact.”
  • higher marks for shoddier work.
  • today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses
  • they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.
  • Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify
  • there is now a thriving sector devoted to producing essay-ready summers
  • To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society.
  • what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.
  • The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things
  • As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell.
  • Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science
  • It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”
  • It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.
  • Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals
  • That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies are working-class and rural whites, who are hardly present
  • The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself.
  • This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent
  • The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game
  • Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools.
  • Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing
  • Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.
  • Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.
  • The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely
  • U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.
  • The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.
  • Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.
  • The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth
  • More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.
  • reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality
  • The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first.
  • I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.
  • High-quality public education, financed with public money, for the benefit of all
  • Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides.
  • We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.
Javier E

The Art of Thinking Well - The New York Times - 1 views

  • Thaler et al. were only scratching the surface of our irrationality. Most behavioral economists study individual thinking. They do much of their research in labs where subjects don’t intimately know the people around them.
  • It’s when we get to the social world that things really get gnarly. A lot of our thinking is for bonding, not truth-seeking, so most of us are quite willing to think or say anything that will help us be liked by our group
  • This is where Alan Jacobs’s absolutely splendid forthcoming book “How to Think” comes in
  • ...11 more annotations...
  • Jacobs’s emphasis on the relational nature of thinking is essential for understanding why there is so much bad thinking in political life right now.
  • Jacobs makes good use of C. S. Lewis’s concept of the Inner Ring. In every setting — a school, a company or a society — there is an official hierarchy. But there may also be a separate prestige hierarchy, where the cool kids are. They are the Inner Ring.
  • think of how you really persuade people. Do you do it by writing thoughtful essays that carefully marshal facts? That works some of the time.
  • Jacobs notices that when somebody uses “in other words” to summarize another’s argument, what follows is almost invariably a ridiculous caricature of that argument, in order to win favor with the team.
  • Jacobs nicely shows how our thinking processes emerge from emotional life and moral character. If your heart and soul are twisted, your response to the world will be, too.
  • I’d say that if social life can get us into trouble, social life can get us out.
  • “The passion for the Inner Ring is most skillful in making a man who is not yet a very bad man do very bad things.”
  • the real way to persuade people is to create an attractive community that people want to join. If you do that, they’ll bend their opinions to yours. If you want people to be reasonable, create groups where it’s cool to be reasonable.
  • Jacobs mentions that at the Yale Political Union members are admired if they can point to a time when a debate totally changed their mind on something. That means they take evidence seriously; that means they can enter into another’s mind-set. It means they treat debate as a learning exercise and not just as a means to victory.
  • How many public institutions celebrate these virtues? The U.S. Senate? Most TV talk shows? Even the universities?
  • People will, for example, identify and attack what Jacobs calls the Repugnant Cultural Other — the group that is opposed to the Inner Ring, which must be assaulted to establish membership in it.
ilanaprincilus06

How the web distorts reality and impairs our judgement skills | Media Network | The Gua... - 0 views

  • IBM estimates that 90% of the world's online data has been created just in the past two years. What's more, it has made information more accessible than ever before.
  • However, rather than enhancing knowledge, the internet has produced an information glut or "infoxication".
  • Furthermore, since online content is often curated to fit our preferences, interests and personality, the internet can even enhance our existing biases and undermine our motivation to learn new things.
    • ilanaprincilus06
       
      When we see our preferences constantly being displayed, we are more likely to go back to that source or website and use it more often.
  • ...14 more annotations...
  • “these filters will isolate people in information bubbles only partly of their own choosing, and the inaccurate beliefs they form as a result may be difficult to correct.”
  • the proliferation of search engines, news aggregators and feed-ranking algorithms is more likely to perpetuate ignorance than knowledge.
  • It would seem that excessive social media use may intensify not only feelings of loneliness, but also ideological isolation.
    • ilanaprincilus06
       
      Would social media networks need to stop exploiting these preferences in order for us to limit ideological isolation?
  • "What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact."
  • Recent studies show that although most people consume information that matches their opinions, being exposed to conflicting views tends to reduce prejudice and enhance creative thinking.
  • the desire to prove ourselves right and maintain our current beliefs trumps any attempt to be creative or more open-minded.
  • "our objects of inquiry are not 'truth' or 'meaning' but rather configurations of consciousness. These are figures or patterns of knowledge, cognitive and practical attitudes, which emerge within a definite historical and cultural context."
  • the internet is best understood as a cultural lens through which we construct – or distort – reality.
  • we can only deal with this overwhelming range of choices by ignoring most of them.
  • trolling is so effective for enticing readers' comments, but so ineffective for changing their viewpoints.
  • Will accumulating facts help you understand the world?
    • ilanaprincilus06
       
      We must take an extra step past just reading/learning about facts and develop second-order thinking about the claims/facts to truly gain a better sense of what is going on.
  • we have developed a dependency on technology, which has eclipsed our reliance on logic, critical thinking and common sense: if you can find the answer online, why bother thinking?
  • it is conceivable that individuals' capacity to evaluate and produce original knowledge will matter more than the actual acquisition of knowledge.
  • Good judgment and decision-making will be more in demand than sheer expertise or domain-specific knowledge.
clairemann

'At 47, I discovered I am autistic - suddenly so many things made sense' | Life and sty... - 0 views

  • I knew I was different and I had always been told I was “too sensitive”. But I don’t fit the dated Rain Man stereotype. I’m a CEO, I’m married, I have two children. Autism is often a hidden disability.
  • I always operated with some level of confusion.
  • My mind is always going a million miles an hour and I don’t really have an off switch. I need to finish what I start at any cost. Now I understand that is part of being autistic.
  • ...10 more annotations...
  • Einstein, Mozart, Michelangelo, Steve Jobs, Bill Gates – all these overachievers are widely believed to be, or have been, on the spectrum.
  • I used recreational drugs to smooth me through the challenges of social communication.
  • With every interaction, verbal or written, I go through a mental checklist: is my response appropriate? Is it relevant? Is this something only I am going to find interesting? Is my tone right? Trying to follow social rules and adapt to an allistic [non-autistic] world is exhausting. No one sees what is going on inside my head.
  • Misunderstandings in communication can blow up quite quickly.
  • I burnt out in my late 20s.
  • My greatest fear has been something I’ve always referred to as “the big alone”. Even when I’ve been in loving relationships, as I am now, there has been a terrible aloneness in not understanding why I am not like other people.
  • I was able to look back at situations and misunderstandings and understand what had happened. I’d been told my communication could be “off” sometimes – a bit intense, a bit abrupt.
  • Autistic children spend a lot of time “masking”, imitating so-called “normal” behaviour. They need to be able to experience their authentic selves.
  • Over the past five or 10 years, the concept of neurodiversity – the idea that these differences in our brains should be celebrated – has become better known. We deserve equality, respect and full social inclusion.
  • We need to start making space for neurodivergent people at school, at work, in life generally. Autistic people bring a whole new set of skills with them.
Javier E

Why Is It So Hard to Be Rational? | The New Yorker - 0 views

  • an unusually large number of books about rationality were being published this year, among them Steven Pinker’s “Rationality: What It Is, Why It Seems Scarce, Why It Matters” (Viking) and Julia Galef’s “The Scout Mindset: Why Some People See Things Clearly and Others Don’t” (Portfolio).
  • When the world changes quickly, we need strategies for understanding it. We hope, reasonably, that rational people will be more careful, honest, truthful, fair-minded, curious, and right than irrational ones.
  • And yet rationality has sharp edges that make it hard to put at the center of one’s life
  • ...43 more annotations...
  • You might be well-intentioned, rational, and mistaken, simply because so much in our thinking can go wrong. (“RATIONAL, adj.: Devoid of all delusions save those of observation, experience and reflection,”
  • You might be rational and self-deceptive, because telling yourself that you are rational can itself become a source of bias. It’s possible that you are trying to appear rational only because you want to impress people; or that you are more rational about some things (your job) than others (your kids); or that your rationality gives way to rancor as soon as your ideas are challenged. Perhaps you irrationally insist on answering difficult questions yourself when you’d be better off trusting the expert consensus.
  • Not just individuals but societies can fall prey to false or compromised rationality. In a 2014 book, “The Revolt of the Public and the Crisis of Authority in the New Millennium,” Martin Gurri, a C.I.A. analyst turned libertarian social thinker, argued that the unmasking of allegedly pseudo-rational institutions had become the central drama of our age: people around the world, having concluded that the bigwigs in our colleges, newsrooms, and legislatures were better at appearing rational than at being so, had embraced a nihilist populism that sees all forms of public rationality as suspect.
  • modern life would be impossible without those rational systems; we must improve them, not reject them. We have no choice but to wrestle with rationality—an ideal that, the sociologist Max Weber wrote, “contains within itself a world of contradictions.”
  • Where others might be completely convinced that G.M.O.s are bad, or that Jack is trustworthy, or that the enemy is Eurasia, a Bayesian assigns probabilities to these propositions. She doesn’t build an immovable world view; instead, by continually updating her probabilities, she inches closer to a more useful account of reality. The cooking is never done.
  • Rationality is one of humanity’s superpowers. How do we keep from misusing it?
  • Start with the big picture, fixing it firmly in your mind. Be cautious as you integrate new information, and don’t jump to conclusions. Notice when new data points do and do not alter your baseline assumptions (most of the time, they won’t alter them), but keep track of how often those assumptions seem contradicted by what’s new. Beware the power of alarming news, and proceed by putting it in a broader, real-world context.
  • Bayesian reasoning implies a few “best practices.”
  • Keep the cooked information over here and the raw information over there; remember that raw ingredients often reduce over heat
  • We want to live in a more rational society, but not in a falsely rationalized one. We want to be more rational as individuals, but not to overdo it. We need to know when to think and when to stop thinking, when to doubt and when to trust.
  • But the real power of the Bayesian approach isn’t procedural; it’s that it replaces the facts in our minds with probabilities.
  • Applied to specific problems—Should you invest in Tesla? How bad is the Delta variant?—the techniques promoted by rationality writers are clarifying and powerful.
  • the rationality movement is also a social movement; rationalists today form what is sometimes called the “rationality community,” and, as evangelists, they hope to increase its size.
  • In “Rationality,” “The Scout Mindset,” and other similar books, irrationality is often presented as a form of misbehavior, which might be rectified through education or socialization.
  • Greg tells me that, in his business, it’s not enough to have rational thoughts. Someone who’s used to pondering questions at leisure might struggle to learn and reason when the clock is ticking; someone who is good at reaching rational conclusions might not be willing to sign on the dotted line when the time comes. Greg’s hedge-fund colleagues describe as “commercial”—a compliment—someone who is not only rational but timely and decisive.
  • You can know what’s right but still struggle to do it.
  • Following through on your own conclusions is one challenge. But a rationalist must also be “metarational,” willing to hand over the thinking keys when someone else is better informed or better trained. This, too, is harder than it sounds.
  • For all this to happen, rationality is necessary, but not sufficient. Thinking straight is just part of the work. 
  • I found it possible to be metarational with my dad not just because I respected his mind but because I knew that he was a good and cautious person who had my and my mother’s best interests at heart.
  • between the two of us, we had the right ingredients—mutual trust, mutual concern, and a shared commitment to reason and to act.
  • Intellectually, we understand that our complex society requires the division of both practical and cognitive labor. We accept that our knowledge maps are limited not just by our smarts but by our time and interests. Still, like Gurri’s populists, rationalists may stage their own contrarian revolts, repeatedly finding that no one’s opinions but their own are defensible. In letting go, as in following through, one’s whole personality gets involved.
  • in truth, it maps out a series of escalating challenges. In search of facts, we must make do with probabilities. Unable to know it all for ourselves, we must rely on others who care enough to know. We must act while we are still uncertain, and we must act in time—sometimes individually, but often together.
  • The realities of rationality are humbling. Know things; want things; use what you know to get what you want. It sounds like a simple formula.
  • The real challenge isn’t being right but knowing how wrong you might be. (By Joshua Rothman, August 16, 2021)
  • Writing about rationality in the early twentieth century, Weber saw himself as coming to grips with a titanic force—an ascendant outlook that was rewriting our values. He talked about rationality in many different ways. We can practice the instrumental rationality of means and ends (how do I get what I want?) and the value rationality of purposes and goals (do I have good reasons for wanting what I want?). We can pursue the rationality of affect (am I cool, calm, and collected?) or develop the rationality of habit (do I live an ordered, or “rationalized,” life?).
  • Weber worried that it was turning each individual into a “cog in the machine,” and life into an “iron cage.” Today, rationality and the words around it are still shadowed with Weberian pessimism and cursed with double meanings. You’re rationalizing the org chart: are you bringing order to chaos, or justifying the illogical?
  • For Aristotle, rationality was what separated human beings from animals. For the authors of “The Rationality Quotient,” it’s a mental faculty, parallel to but distinct from intelligence, which involves a person’s ability to juggle many scenarios in her head at once, without letting any one monopolize her attention or bias her against the rest.
  • In “The Rationality Quotient: Toward a Test of Rational Thinking” (M.I.T.), from 2016, the psychologists Keith E. Stanovich, Richard F. West, and Maggie E. Toplak call rationality “a torturous and tortured term,” in part because philosophers, sociologists, psychologists, and economists have all defined it differently
  • Galef, who hosts a podcast called “Rationally Speaking” and co-founded the nonprofit Center for Applied Rationality, in Berkeley, barely uses the word “rationality” in her book on the subject. Instead, she describes a “scout mindset,” which can help you “to recognize when you are wrong, to seek out your blind spots, to test your assumptions and change course.” (The “soldier mindset,” by contrast, encourages you to defend your positions at any cost.)
  • Galef tends to see rationality as a method for acquiring more accurate views.
  • Pinker, a cognitive and evolutionary psychologist, sees it instrumentally, as “the ability to use knowledge to attain goals.” By this definition, to be a rational person you have to know things, you have to want things, and you have to use what you know to get what you want.
  • Introspection is key to rationality. A rational person must practice what the neuroscientist Stephen Fleming, in “Know Thyself: The Science of Self-Awareness” (Basic Books), calls “metacognition,” or “the ability to think about our own thinking”—“a fragile, beautiful, and frankly bizarre feature of the human mind.”
  • A successful student uses metacognition to know when he needs to study more and when he’s studied enough: essentially, parts of his brain are monitoring other parts.
  • In everyday life, the biggest obstacle to metacognition is what psychologists call the “illusion of fluency.” As we perform increasingly familiar tasks, we monitor our performance less rigorously; this happens when we drive, or fold laundry, and also when we think thoughts we’ve thought many times before
  • The trick is to break the illusion of fluency, and to encourage an “awareness of ignorance.”
  • metacognition is a skill. Some people are better at it than others. Galef believes that, by “calibrating” our metacognitive minds, we can improve our performance and so become more rational
  • There are many calibration methods
  • Knowing about what you know is Rationality 101. The advanced coursework has to do with changes in your knowledge.
  • Most of us stay informed straightforwardly—by taking in new information. Rationalists do the same, but self-consciously, with an eye to deliberately redrawing their mental maps.
  • The challenge is that news about distant territories drifts in from many sources; fresh facts and opinions aren’t uniformly significant. In recent decades, rationalists confronting this problem have rallied behind the work of Thomas Bayes
  • So-called Bayesian reasoning—a particular thinking technique, with its own distinctive jargon—has become de rigueur.
  • the basic idea is simple. When new information comes in, you don’t want it to replace old information wholesale. Instead, you want it to modify what you already know to an appropriate degree. The degree of modification depends both on your confidence in your preëxisting knowledge and on the value of the new data. Bayesian reasoners begin with what they call the “prior” probability of something being true, and then find out if they need to adjust it.
  • Bayesian reasoning is an approach to statistics, but you can use it to interpret all sorts of new information.
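The updating rule these excerpts describe can be sketched in a few lines of Python. This is a minimal illustration of Bayes’ theorem, not anything from the article itself; the function name and the probabilities are invented for the example:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | E) from a prior P(H) and the
    likelihood of the evidence under H and under not-H."""
    numerator = prior * p_evidence_given_h
    # Total probability of seeing the evidence at all
    evidence = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / evidence

# Suppose you give a claim a 30% prior, then see evidence that is
# four times likelier if the claim is true (0.8 vs 0.2):
posterior = bayes_update(0.30, 0.80, 0.20)
print(round(posterior, 3))  # 0.632 — belief rises, but isn't replaced wholesale
```

Note how the new information modifies rather than overwrites the old: a 30% prior moves to roughly 63%, not to certainty, which is exactly the “appropriate degree” of adjustment the passage describes.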
Javier E

For Lee Tilghman, There Is Life After Influencing - The New York Times - 0 views

  • At her first full-time job since leaving influencing, the erstwhile smoothie-bowl virtuoso Lee Tilghman stunned a new co-worker with her enthusiasm for the 9-to-5 grind.
  • The co-worker pulled her aside that first morning, wanting to impress upon her the stakes of that decision. “This is terrible,” he told her. “Like, I’m at a desk.” “You don’t get it,” Ms. Tilghman remembered saying. “You think you’re a slave, but you’re not.” He had it backward, she added. “When you’re an influencer, then you have chains on.”
  • In the late 2010s, for a certain subset of millennial women, Ms. Tilghman was wellness culture, a warm-blooded mood board of Outdoor Voices workout sets, coconut oil and headstands. She had earned north of $300,000 a year — and then dropped more than 150,000 followers, her entire management team, and most of her savings to become an I.R.L. person.
  • ...8 more annotations...
  • The corporate gig, as a social media director for a tech platform, was a revelation. “I could just show up to work and do work,” Ms. Tilghman said. After she was done, she could leave. She didn’t have to be a brand. There’s no comments section at an office job.
  • In 2019, a Morning Consult report found that 54 percent of Gen Z and millennial Americans were interested in becoming influencers. (Eighty-six percent said they would be willing to post sponsored content for money.)
  • If social media has made audiences anxious, it’s driving creators to the brink. In 2021, the TikTok breakout star Charli D’Amelio said she had “lost the passion” for posting videos. A few months later, Erin Kern announced to her 600,000 Instagram followers that she would be deactivating her account @cottonstem; she had been losing her hair, and her doctors blamed work-induced stress
  • Other influencers faded without fanfare — teens whose mental health had taken too much of a hit and amateur influencers who stopped posting after an algorithm tweak tanked their metrics. Some had been at this for a decade or more, starting at 12 or 14 or 19.
  • She posted less, testing out new identities that she hoped wouldn’t touch off the same spiral that wellness had. There were dancing videos, dog photos, interior design. None of it stuck. (“You can change the niche, but you’re still going to be performing your life for content,” she explained over lunch.)
  • Ms. Tilghman’s problem — as the interest in the workshop, which she decided to cap at 15, demonstrated — is that she has an undeniable knack for this. In 2022, she started a Substack to continue writing, thinking of it as a calling card while she applied to editorial jobs; it soon amassed 20,000 subscribers. It once had a different name, but now it’s called “Offline Time.” The paid tier costs $5 a month.
  • Casey Lewis, who helms the After School newsletter about Gen Z consumer trends, predicts more pivots and exits. TikTok has elevated creators faster than other platforms and burned them out quicker, she said.
  • Ms. Lewis expects a swell of former influencers taking jobs with P.R. agencies, marketing firms and product development conglomerates. She pointed out that creators have experience not just in video and photo editing, but in image management, crisis communication and rapid response. “Those skills do transfer,” she said.
Javier E

The Ignorance Caucus - NYTimes.com - 0 views

  • Last year the Texas G.O.P. explicitly condemned efforts to teach “critical thinking skills,” because, it said, such efforts “have the purpose of challenging the student’s fixed beliefs and undermining parental authority.”
  • even when giving a speech intended to demonstrate his openness to new ideas, Mr. Cantor felt obliged to give that caucus a shout-out, calling for a complete end to federal funding of social science research. Because it’s surely a waste of money seeking to understand the society we’re trying to change.
  • ...7 more annotations...
  • Mr. Cantor’s support for medical research is curiously limited. He’s all for developing new treatments, but he and his colleagues have adamantly opposed “comparative effectiveness research,” which seeks to determine how well such treatments work.
  • Hillary Clinton said of her Republican critics, “They just will not live in an evidence-based world.”
  • in his home state of Virginia — have engaged in furious witch hunts against scientists who find evidence they don’t like. True, the state has finally agreed to study the growing risk of coastal flooding; Norfolk is among the American cities most vulnerable to climate change. But Republicans in the State Legislature have specifically prohibited the use of the words “sea-level rise.”
  • the parties aren’t just divided on values and policy views, they’re divided over epistemology. One side believes, at least in principle, in letting its policy views be shaped by facts; the other believes in suppressing the facts if they contradict its fixed beliefs.
  • while Democrats, being human, often read evidence selectively and choose to believe things that make them comfortable, there really isn’t anything equivalent to Republicans’ active hostility to collecting evidence in the first place.
  • for all the talk of reforming and reinventing the G.O.P., the ignorance caucus retains a firm grip on the party’s heart and mind.
  • It would be helpful to these discussions if we had a good grasp of the facts about firearms and violence. But we don’t, because back in the 1990s conservative politicians, acting on behalf of the National Rifle Association, bullied federal agencies into ceasing just about all research into the issue.
sissij

Fake Academe, Looking Much Like the Real Thing - The New York Times - 0 views

  • Academics need to publish in order to advance professionally, get better jobs or secure tenure.
  •  
    Academe is losing its meaning now because society only sees how many journals you have published, not what you actually write in them. I think the growing business of academic publication fraud reflects that our society values our certificates more than our skills. The numerous articles on those "good" colleges also put pressure on teenagers and parents that a title means all. However, that shouldn't be the core of education. There is never a shortcut to success. --Sissi (12/31/2016)
Javier E

What Is College For? - NYTimes.com - 0 views

  • 74 percent of graduates from four-year colleges say that their education was “very useful in helping them grow intellectually.”
  • there are serious concerns about the quality of this experience.  In particular, the university curriculum leaves students disengaged from the material they are supposed to be learning.  They see most of their courses as intrinsically “boring,” of value only if they provide training relevant to future employment or if the teacher has a pleasing (amusing, exciting, “relevant”) way of presenting the material. As a result, students spend only as much time as they need to get what they see as acceptable grades (on average, about 12 to 14 hours a week for all courses combined).  Professors have ceased to expect genuine engagement from students and often give good grades (B or better) to work that is at best minimally adequate.
  • This lack of academic engagement is real, even among schools with the best students and the best teachers, and it increases dramatically as the quality of the school decreases.  But it results from a basic misunderstanding — by both students and teachers — of what colleges are for.
  • ...5 more annotations...
  • First of all, they are not simply for the education of students.  This is an essential function, but the raison d’être of a college is to nourish a world of intellectual culture; that is, a world of ideas, dedicated to what we can know scientifically, understand humanistically, or express artistically.  In our society, this world is mainly populated by members of college faculties: scientists, humanists, social scientists (who straddle the humanities and the sciences properly speaking), and those who study the fine arts. Law, medicine and engineering are included to the extent that they are still understood as “learned professions,” deploying practical skills that are nonetheless deeply rooted in scientific knowledge or humanistic understanding
  • When, as is often the case in business education and teacher training, practical skills far outweigh theoretical understanding, we are moving beyond the intellectual culture that defines higher education
  • Our support for higher education makes sense only if we regard this intellectual culture as essential to our society
  • This has important consequences for how we regard what goes on in college classrooms.  Teachers need to see themselves as, first of all, intellectuals, dedicated to understanding poetry, history, human psychology, physics, biology — or whatever is the focus of their discipline.  But they also need to realize that this dedication expresses not just their idiosyncratic interest in certain questions but a conviction that those questions have general human significance, even apart from immediately practical applications.  This is why a discipline requires not just research but also teaching
  • Students, in turn, need to recognize that their college education is above all a matter of opening themselves up to new dimensions of knowledge and understanding.  Teaching is not a matter of (as we too often say) “making a subject (poetry, physics, philosophy) interesting” to students but of students coming to see how such subjects are intrinsically interesting.  It is more a matter of students moving beyond their interests than of teachers fitting their subjects to interests that students already have.   Good teaching does not make a course’s subject more interesting; it gives the students more interests — and so makes them more interesting.