TOK Friends: Group items tagged "aging"

Javier E

[Six Questions] | Astra Taylor on The People's Platform: Taking Back Power and Culture ... - 1 views

  • Astra Taylor, a cultural critic and the director of the documentaries Zizek! and Examined Life, challenges the notion that the Internet has brought us into an age of cultural democracy. While some have hailed the medium as a platform for diverse voices and the free exchange of information and ideas, Taylor shows that these assumptions are suspect at best. Instead, she argues, the new cultural order looks much like the old: big voices overshadow small ones, content is sensationalist and powered by advertisements, quality work is underfunded, and corporate giants like Google and Facebook rule. The Internet does offer promising tools, Taylor writes, but a cultural democracy will be born only if we work collaboratively to develop the potential of this powerful resource
  • Most people don’t realize how little information can be conveyed in a feature film. The transcripts of both of my movies are probably equivalent in length to a Harper’s cover story.
  • why should Amazon, Apple, Facebook, and Google get a free pass? Why should we expect them to behave any differently over the long term? The tradition of progressive media criticism that came out of the Frankfurt School, not to mention the basic concept of political economy (looking at the way business interests shape the cultural landscape), was nowhere to be seen, and that worried me. It’s not like political economy became irrelevant the second the Internet was invented.
  • How do we reconcile our enjoyment of social media even as we understand that the corporations who control them aren’t always acting in our best interests?
  • That was because the underlying economic conditions hadn’t been changed or “disrupted,” to use a favorite Silicon Valley phrase. Google has to serve its shareholders, just like NBCUniversal does. As a result, many of the unappealing aspects of the legacy-media model have simply carried over into a digital age — namely, commercialism, consolidation, and centralization. In fact, the new system is even more dependent on advertising dollars than the one that preceded it, and digital advertising is far more invasive and ubiquitous
  • the popular narrative — new communications technologies would topple the establishment and empower regular people — didn’t accurately capture reality. Something more complex and predictable was happening. The old-media dinosaurs weren’t dying out, but were adapting to the online environment; meanwhile the new tech titans were coming increasingly to resemble their predecessors
  • I use lots of products that are created by companies whose business practices I object to and that don’t act in my best interests, or the best interests of workers or the environment — we all do, since that’s part of living under capitalism. That said, I refuse to invest so much in any platform that I can’t quit without remorse
  • these services aren’t free even if we don’t pay money for them; we pay with our personal data, with our privacy. This feeds into the larger surveillance debate, since government snooping piggybacks on corporate data collection. As I argue in the book, there are also negative cultural consequences (e.g., when advertisers are paying the tab we get more of the kind of culture marketers like to associate themselves with and less of the stuff they don’t) and worrying social costs. For example, the White House and the Federal Trade Commission have both recently warned that the era of “big data” opens new avenues of discrimination and may erode hard-won consumer protections.
  • I’m resistant to the tendency to place this responsibility solely on the shoulders of users. Gadgets and platforms are designed to be addictive, with every element from color schemes to headlines carefully tested to maximize clickability and engagement. The recent news that Facebook tweaked its algorithms for a week in 2012, showing hundreds of thousands of users only “happy” or “sad” posts in order to study emotional contagion — in other words, to manipulate people’s mental states — is further evidence that these platforms are not neutral. In the end, Facebook wants us to feel the emotion of wanting to visit Facebook frequently
  • social inequalities that exist in the real world remain meaningful online. What are the particular dangers of discrimination on the Internet?
  • That it’s invisible or at least harder to track and prove. We haven’t figured out how to deal with the unique ways prejudice plays out over digital channels, and that’s partly because some folks can’t accept the fact that discrimination persists online. (After all, there is no sign on the door that reads Minorities Not Allowed.)
  • just because the Internet is open doesn’t mean it’s equal; offline hierarchies carry over to the online world and are even amplified there. For the past year or so, there has been a lively discussion taking place about the disproportionate and often outrageous sexual harassment women face simply for entering virtual space and asserting themselves there — research verifies that female Internet users are dramatically more likely to be threatened or stalked than their male counterparts — and yet there is very little agreement about what, if anything, can be done to address the problem.
  • What steps can we take to encourage better representation of independent and non-commercial media? We need to fund it, first and foremost. As individuals this means paying for the stuff we believe in and want to see thrive. But I don’t think enlightened consumption can get us where we need to go on its own. I’m skeptical of the idea that we can shop our way to a better world. The dominance of commercial media is a social and political problem that demands a collective solution, so I make an argument for state funding and propose a reconceptualization of public media. More generally, I’m struck by the fact that we use these civic-minded metaphors, calling Google Books a “library” or Twitter a “town square” — or even calling social media “social” — but real public options are off the table, at least in the United States. We hand the digital commons over to private corporations at our peril.
  • 6. You advocate for greater government regulation of the Internet. Why is this important?
  • I’m for regulating specific things, like Internet access, which is what the fight for net neutrality is ultimately about. We also need stronger privacy protections and restrictions on data gathering, retention, and use, which won’t happen without a fight.
  • I challenge the techno-libertarian insistence that the government has no productive role to play and that it needs to keep its hands off the Internet for fear that it will be “broken.” The Internet and personal computing as we know them wouldn’t exist without state investment and innovation, so let’s be real.
  • there’s a pervasive and ill-advised faith that technology will promote competition if left to its own devices (“competition is a click away,” tech executives like to say), but that’s not true for a variety of reasons. The paradox of our current media landscape is this: our devices and consumption patterns are ever more personalized, yet we’re simultaneously connected to this immense, opaque, centralized infrastructure. We’re all dependent on a handful of firms that are effectively monopolies — from Time Warner and Comcast on up to Google and Facebook — and we’re seeing increased vertical integration, with companies acting as both distributors and creators of content. Amazon aspires to be the bookstore, the bookshelf, and the book. Google isn’t just a search engine, a popular browser, and an operating system; it also invests in original content
  • So it’s not that the Internet needs to be regulated but that these big tech corporations need to be subject to governmental oversight. After all, they are reaching farther and farther into our intimate lives. They’re watching us. Someone should be watching them.
sissij

Science in the Age of Alternative Facts | Big Think - 0 views

  • discovered a peculiar aspect of human psychology and physiology: the placebo effect. As biographer Richard Holmes writes regarding their increased health, “It was simply because the patients believed they would be cured.”
  • Most importantly he did not finagle results to fit his preconceived notion of what this and other gases accomplish.
  • For science to work we need to move out of the way of ourselves and observe the data. Right now too many emotionally stunted and corporate-backed obstacles stand in the way of that.
  •  
    Alternative facts are spooky things that confuse the difference between what we think is happening and what's really taking place. In this article, the author uses the example of Davy to suggest that treating the data objectively is what science should be doing. Data is tricky in science because we can draw different conclusions from the same set of data. Just like the line-drawing game we played in TOK, there are infinite numbers of lines we can draw to connect all the data points, but only one of them is true. As the author shows in this article, the best way to avoid creating alternative facts is to leave out our emotions and personal opinions and let the data speak. Although intuition and imagination are good for science, most of the time we need to remind ourselves not to force the data.
dicindioha

How Do Kids See the World on a Family Trip? - The New York Times - 0 views

  • We gave six families 360-degree video cameras to show us a trip through the eyes of a child.
  • Dr. Klass talked about the perspective of children from the ages of 3 to 15 and how families can better understand what experiences would be the most compelling to them.
  • “An older child might accept the challenge to find the self-portrait,” she said, “but a younger child might be more interested in the question of who is and who isn’t wearing underwear.”
  • “One of the things you want to do as a parent traveling is, look for places where you can interact a little more with objects and push the buttons,” Dr. Klass said.
  • “You can bring a bunch of 7-year-olds into a room, and some will be drawn to sports equipment, some drawn to animals. They have really distinct interests you’ll want to play to,”
  • “They haven’t yet decided that anything you’re interested in may not be the coolest thing. And they probably still want to know what you are excited about,”
  • “As for teenagers, they may want to lead the way, and they may show you things about the world you never knew.”
  •  
    This article is really unique as it compares the way different ages view the world. Earlier in the year we learned about how our perspective is greatly based upon former experiences. Young kids want to interact hands-on with whatever they can, and teenagers may eventually not want to travel with their parents; it depends partly on what has happened before in their lives.
Javier E

The Older Mind May Just Be a Fuller Mind - NYTimes.com - 0 views

  • Memory’s speed and accuracy begin to slip around age 25 and keep on slipping.
  • Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing
  • Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared.
  • Neuroscientists have some reason to believe that neural processing speed, like many reflexes, slows over the years; anatomical studies suggest that the brain also undergoes subtle structural changes that could affect memory.
  • doubts about the average extent of the decline are rooted not in individual differences but in study methodology. Many studies comparing older and younger people, for instance, did not take into account the effects of pre-symptomatic Alzheimer’s disease,
  • The new data-mining analysis also raises questions about many of the measures scientists use. Dr. Ramscar and his colleagues applied leading learning models to an estimated pool of words and phrases that an educated 70-year-old would have seen, and another pool suitable for an educated 20-year-old. Their model accounted for more than 75 percent of the difference in scores between older and younger adults on items in a paired-associate test
  • That is to say, the larger the library you have in your head, the longer it usually takes to find a particular word (or pair).
  • Scientists who study thinking and memory often make a broad distinction between “fluid” and “crystallized” intelligence. The former includes short-term memory, like holding a phone number in mind, analytical reasoning, and the ability to tune out distractions, like ambient conversation. The latter is accumulated knowledge, vocabulary and expertise.
  • an increase in crystallized intelligence can account for a decrease in fluid intelligence,
Javier E

Our Biased Brains - NYTimes.com - 0 views

  • The human brain seems to be wired so that it categorizes people by race in the first one-fifth of a second after seeing a face
  • Racial bias also begins astonishingly early: Even infants often show a preference for their own racial group. In one study, 3-month-old white infants were shown photos of faces of white adults and black adults; they preferred the faces of whites. For 3-month-old black infants living in Africa, it was the reverse.
  • in evolutionary times we became hard-wired to make instantaneous judgments about whether someone is in our “in group” or not — because that could be lifesaving. A child who didn’t prefer his or her own group might have been at risk of being clubbed to death.
  • I encourage you to test yourself at implicit.harvard.edu. It’s sobering to discover that whatever you believe intellectually, you’re biased about race, gender, age or disability.
  • unconscious racial bias turns up in children as soon as they have the verbal skills to be tested for it, at about age 4. The degree of unconscious bias then seems pretty constant: In tests, this unconscious bias turns out to be roughly the same for a 4- or 6-year-old as for a senior citizen who grew up in more racially oppressive times.
  • Many of these experiments on in-group bias have been conducted around the world, and almost every ethnic group shows a bias favoring its own. One exception: African-Americans.
  • in contrast to other groups, African-Americans do not have an unconscious bias toward their own. From young children to adults, they are essentially neutral and favor neither whites nor blacks.
  • even if we humans have evolved to have a penchant for racial preferences from a very young age, this is not destiny. We can resist the legacy that evolution has bequeathed us.
  • “We wouldn’t have survived if our ancestors hadn’t developed bodies that store sugar and fat,” Banaji says. “What made them survive is what kills us.” Yet we fight the battle of the bulge and sometimes win — and, likewise, we can resist a predisposition for bias against other groups.
  • Deep friendships, especially romantic relationships with someone of another race, also seem to mute bias
summertyler

Is It Ordinary Memory Loss, or Alzheimer's Disease? - NYTimes.com - 0 views

  • worried about her memory, wondering if she could have the beginnings of dementia
  • no more difficulty than the rest of us her age in remembering events, names and places, her physician suggested that, given her level of concern, she should have things checked out
  • two days of tests of her cognitive abilities
  • The result: reassurance and relief. Everything was in the normal range for her age, and she registered as superior on the ability to perform tasks and solve problems.
  • Simple tests done in eight to 12 minutes in a doctor’s office can determine whether memory issues are normal for one’s age or are problematic and warrant a more thorough evaluation.
  • more than half of older adults with signs of memory loss never see a doctor about it
  • “Early evaluation and identification of people with dementia may help them receive care earlier,”
  • “It can help families make plans for care, help with day-to-day tasks, including medication administration, and watch for future problems that can occur.”
  • Both tests measure orientation to time, date and place; attention and concentration; ability to calculate; memory; language; and conceptual thinking.
  • its score can be skewed by a person’s level of education, cultural background, a learning or speech disorder, and language fluency
  •  
    Memory loss is difficult to understand because of the many factors that affect it.
jongardner04

To Improve a Memory, Consider Chocolate - NYTimes.com - 1 views

  •  
    Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age.
Javier E

How the Internet Gets Inside Us : The New Yorker - 0 views

  • It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with
  • The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare.
  • Robert K. Logan’s “The Sixth Language” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness.
  • In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.
  • “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”
  • when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix.
  • The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965.
  • If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.
  • Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began.
  • Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.
  • The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points.
  • at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.
  • What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own.
  • A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them.
  • And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.
  • Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user.
  • the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.
Javier E

College for Grown-Ups - NYTimes.com - 0 views

  • If we were starting from zero, we probably wouldn’t design colleges as age-segregated playgrounds in which teenagers and very young adults are given free rein to spend their time more or less as they choose. Yet this is the reality.
  • Rethinking the expectation that applicants to selective colleges be fresh out of high school would go far in reducing risk for young people while better protecting everyone’s college investment. Some of this rethinking is already underway. Temporarily delaying college for a year or two after high school is now becoming respectable among the admissions gatekeepers at top schools. Massive open online courses (MOOCs) and other forms of online learning make it possible to experience fragments of an elite education at little or no cost.
  • people are tinkering further with conventional campus models. The Minerva Project, a San Francisco start-up with offices two blocks from Twitter, offers classic seminar-style college courses via a sophisticated interactive online learning platform and accompanies them with residencies in cities all over the world. Nearby in the SoMa district, Dev Bootcamp, a 19-week immersive curriculum that trains people of all ages for jobs in the tech industry, is a popular alternative. Some successfully employed graduates brag of bypassing college altogether.
  • At Stanford, where I teach, an idea still in the concept phase developed by a student-led team in the university’s Hasso Plattner Institute of Design calls for the replacement of four consecutive college years in young adulthood with multiple residencies distributed over a lifetime. What the designers call Open Loop University would grant students admitted to Stanford multiple years to spend on campus, along with advisers to help them time those years strategically in light of their personal development and career ambitions. Today’s arbitrarily segregated world of teenagers and young adults would become an ever-replenished intergenerational community of purposeful learners.
  • the status quo is not sustainable. Unrelenting demand for better-educated workers, rapidly developing technological capacity to support learning digitally and the soaring costs of conventional campus life are driving us toward substantial change.
Javier E

Ann Coulter Is Right to Fear the World Cup - Peter Beinart - The Atlantic - 1 views

  • Ann Coulter penned a column explaining why soccer is un-American. First, it’s collectivist. (“Individual achievement is not a big factor…blame is dispersed.”) Second, it’s effeminate. (“It’s a sport in which athletic talent finds so little expression that girls can play with boys.”) Third, it’s culturally elitist. (“The same people trying to push soccer on Americans are the ones demanding that we love HBO’s “Girls,” light-rail, Beyoncé and Hillary Clinton.”) Fourth, and most importantly, “It’s foreign…Soccer is like the metric system, which liberals also adore because it’s European.”
  • Soccer hatred, in other words, exemplifies American exceptionalism.
  • For Coulter and many contemporary conservatives, by contrast, part of what makes America exceptional is its individualism, manliness and populism
  • Coulter’s deeper point is that for America to truly be America, it must stand apart
  • The core problem with embracing soccer is that in so doing, America would become more like the rest of the world.
  • America’s own league, Major League Soccer, draws as many fans to its stadiums as do the NHL and NBA.
  • I wrote an essay entitled “The End of American Exceptionalism,” which argued that on subjects where the United States has long been seen as different, attitudes in America increasingly resemble those in Europe. Soccer is one of the best examples yet.
  • “Soccer,” Markovits and Hellerman argue, “was perceived by both native-born Americans and immigrants as a non-American activity at a time in American history when nativism and nationalism emerged to create a distinctly American self-image … if one liked soccer, one was viewed as at least resisting—if not outright rejecting—integration into America.”
  • The average age of Americans who call baseball their favorite sport is 53. Among Americans who like football best, it’s 46. Among Americans who prefer soccer, by contrast, the average age is only 37.
  • Old-stock Americans, in other words, were elevating baseball, football, and basketball into symbols of America’s distinct identity. Immigrants realized that embracing those sports offered a way to claim that identity for themselves. Clinging to soccer, by contrast, was a declaration that you would not melt.
  • why is interest in soccer rising now? Partly, because the United States is yet again witnessing mass immigration from soccer-mad nations.
  • the key shift is that America’s sports culture is less nativist. More native-born Americans now accept that a game invented overseas can become authentically American, and that the immigrants who love it can become authentically American too. Fewer believe that to have merit, something must be invented in the United States.
  • why didn’t soccer gain a foothold in the U.S. in the decades between the Civil War and World War I, when it was gaining dominance in Europe? Precisely because it was gaining dominance in Europe. The arbiters of taste in late 19th and early 20th century America wanted its national pastimes to be exceptional.
  • Americans over the age of 50 were 15 points more likely to say “our culture is superior” than were people over 50 in Germany, Spain, Britain, and France
  • Americans under 30, by contrast, were actually less likely to say “our culture is superior” than their counterparts in Germany, Spain, and Britain.
  • Americans today are less likely to insist that America’s way of doing things is always best. In 2002, 60 percent of Americans told the Pew Research Center that, “our culture is superior to others.” By 2011, it was down to 49 percent.
  • the third major pro-soccer constituency is liberals. They’re willing to embrace a European sport for the same reason they’re willing to embrace a European-style health care system: because they see no inherent value in America being an exception to the global rule
  • When the real-estate website Estately created a seven part index to determine a state’s love of soccer, it found that Washington State, Maryland, the District of Columbia, New York, and New Jersey—all bright blue—loved soccer best, while Alabama, Arkansas, North Dakota, Mississippi and Montana—all bright red—liked it least.
  • the soccer coalition—immigrants, liberals and the young—looks a lot like the Obama coalition.
  • Sports-wise, therefore, Democrats constitute an alliance between soccer and basketball fans while Republicans disproportionately follow baseball, golf, and NASCAR. Football, by far America’s most popular sport, crosses the aisle.
  • The willingness of growing numbers of Americans to embrace soccer bespeaks their willingness to imagine a different relationship with the world. Historically, conservative foreign policy has oscillated between isolationism and imperialism. America must either retreat from the world or master it. It cannot be one among equals, bound by the same rules as everyone else
  • Exceptionalists view sports the same way. Coulter likes football, baseball, and basketball because America either plays them by itself, or—when other countries play against us—we dominate them.
  • Embracing soccer, by contrast, means embracing America’s role as merely one nation among many, without special privileges. It’s no coincidence that young Americans, in addition to liking soccer, also like the United Nations. In 2013, Pew found that Americans under 30 were 24 points more favorable to the U.N. than Americans over 50.
  • Millennials were also 23 points more likely than the elderly to say America should take its allies’ opinion into account even if means compromising our desires.
  • In embracing soccer, Americans are learning to take something we neither invented nor control, and nonetheless make it our own. It’s a skill we’re going to need in the years to come.
Javier E

How to Raise a University's Profile: Pricing and Packaging - NYTimes.com - 0 views

  • I talked to a half-dozen of Hugh Moren’s fellow students. A highly indebted senior who was terrified of the weak job market described George Washington, where he had invested considerable time getting and doing internships, as “the world’s most expensive trade school.” Another mentioned the abundance of rich students whose parents were giving them a fancy-sounding diploma the way they might a new car. There are serious students here, he acknowledged, but: “You can go to G.W. and essentially buy a degree.”
  • A recent study from the Organization for Economic Cooperation and Development found that, on average, American college graduates score well below college graduates from most other industrialized countries in mathematics. In literacy (“understanding, evaluating, using and engaging with written text”), scores are just average. This comes on the heels of Richard Arum and Josipa Roksa’s “Academically Adrift,” a study that found “limited or no learning” among many college students.
  • Instead of focusing on undergraduate learning, colleges have been engaged in the kind of building spree I saw at George Washington. Recreation centers with world-class workout facilities and lazy rivers rise out of construction pits even as students and parents are handed staggeringly large tuition bills. Colleges compete to hire famous professors even as undergraduates wander through academic programs that often lack rigor or coherence. Campuses vie to become the next Harvard — or at least the next George Washington — while ignoring the growing cost and suspect quality of undergraduate education.
  • Mr. Trachtenberg understood the centrality of the university as a physical place. New structures were a visceral sign of progress. They told visitors, donors and civic leaders that the institution was, like beams and scaffolding rising from the earth, ascending. He added new programs, recruited more students, and followed the dictate of constant expansion.
  • the American research university had evolved into a complicated and somewhat peculiar organization. It was built to be all things to all people: to teach undergraduates, produce knowledge, socialize young men and women, train workers for jobs, anchor local economies, even put on weekend sports events. And excellence was defined by similarity to old, elite institutions. Universities were judged by the quality of their scholars, the size of their endowments, the beauty of their buildings and the test scores of their incoming students.
  • John Silber embarked on a huge building campaign while bringing luminaries like Saul Bellow and Elie Wiesel on board to teach and lend their prestige to the B.U. name, creating a bigger, more famous and much more costly institution. He had helped write a game plan for the aspiring college president.
  • GWU is, for all intents and purposes, a for-profit organization. Best example: study abroad. Their top program, a partnership with Sciences Po, costs each student (30 of them, on a program with 'prestige' status?) a full semester's tuition. It costs GW, according to the Sciences Po website, €1000. A neat $20,000 profit per student (who is digging her/himself deeper and deeper into debt). Moreover, the school takes a $500 admin fee for the study abroad application! With no guarantee that all credits transfer. Students often lose a partial semester, GW profits again. Nor does GW offer help with an antiquated, one-shot/no transfers, tricky registration process. It's tough luck in gay Paris. Just one of many examples. Dorms with extreme mold, off-campus housing impossible for freshmen and sophomores. Required meal plan: Chick-o-Filet etc. Classes with over 300 students (required). This is not Harvard, but costs the same. Emotional problems? Counselors too few. Suicides continue and are not appropriately addressed. Caring environment? Extension so and so, please hold. It's an impressive campus, I'm an alum. If you apply, make sure the DC experience is worth the price: what's good are internships, a few colleges like Elliot School, post-grad. GWU uses undergrad $$ directly for building projects, like the medical center to which students have NO access. (Student health facility is underfunded, outsourced.) Outstanding professors still make a difference. But is that enough?
  • Mr. Trachtenberg, however, understood something crucial about the modern university. It had come to inhabit a market for luxury goods. People don’t buy Gucci bags merely for their beauty and functionality. They buy them because other people will know they can afford the price of purchase. The great virtue of a luxury good, from the manufacturer’s standpoint, isn’t just that people will pay extra money for the feeling associated with a name brand. It’s that the high price is, in and of itself, a crucial part of what people are buying.
  • Mr. Trachtenberg convinced people that George Washington was worth a lot more money by charging a lot more money. Unlike most college presidents, he was surprisingly candid about his strategy. College is like vodka, he liked to explain.
  • The Absolut Rolex plan worked. The number of applicants surged from some 6,000 to 20,000, the average SAT score of students rose by nearly 200 points, and the endowment jumped from $200 million to almost $1 billion.
  • The university became a magnet for the children of new money who didn’t quite have the SATs or family connections required for admission to Stanford or Yale. It also aggressively recruited international students, rich families from Asia and the Middle East who believed, as nearly everyone did, that American universities were the best in the world.
  • U.S. News & World Report now ranks the university at No. 54 nationwide, just outside the “first tier.”
  • The watch and vodka analogies are correct. Personally, I used car analogies when discussing college choices with my kids. We were in the fortunate position of being able to comfortably send our kids to any college in the country and have them leave debt free. Notwithstanding, I told them that they would be going to a state school unless they were able to get into one of about 40 schools that I felt, in whatever arbitrary manner I decided, was worth the extra cost. They both ended up going to state schools. College is by and large a commodity and you get out of it what you put into it. Both of my kids worked hard in college and were involved in school life. They both left the schools better people and the schools better schools for them being there. They are both now successful adults. I believe too many people look for the prestige of a named school and that is not what college should be primarily about.
  • In 2013, only 14 percent of the university’s 10,000 undergraduates received a grant — a figure on a par with elite schools but far below the national average. The average undergraduate borrower leaves with about $30,800 in debt.
  • When I talk to the best high school students in my state I always stress the benefits of the honors college experience at an affordable public university. For students who won't qualify for a public honors college, the regular public university experience is far preferable to the huge debt of places like GW.
  • Carey would do well to look beyond high ticket private universities (which after all are still private enterprises) and what he describes as the Olympian heights of higher education (which for some reason seems also to embitter him) and look at the system overall. The withdrawal of public support was never a policy choice; it was a political choice, "packaged and branded" as some tax cutting palaver all wrapped up in the argument that a free-market should decide how much college should cost and how many seats we need. In such an environment, trustees at private universities are no more solely responsible for turning their degrees into commodities than the administrations of state universities are for raising the number of out-of-state students in order to offset the loss of support from their legislatures. No doubt, we will hear more about market based solutions and technology from Mr. Carey.
  • I went to GW back in the 60s. It was affordable and it got me away from home in New York. While I was there, Newsweek famously published an article about the DC Universities - GW, Georgetown, American and Catholic - dubbing them the Pony league, the schools for the children of wealthy middle class New Yorkers who couldn't get into the Ivy League. Nobody really complained. But that wasn't me. I went because I wanted to be where the action was in the 60s, and as we used to say - "GW was literally a stone's throw from the White House. And we could prove it." Back then, the two biggest alumni names were Jackie Kennedy, who's taken some classes there, and J. Edgar Hoover. Now, according to the glossy magazine they send me each month, it's the actress Kerry Washington. There's some sort of progress there, but I'm a GW alum and not properly trained to understand it.
  • This explains a lot of the modern, emerging mentality. It encompasses the culture of enforced grade inflation, cheating and anti-intellectualism in much of higher education. It is consistent with our culture of misleading statistics and information, cronyism and fake quality, the "best and the brightest" being only schemers and glad handers. The wisdom and creativity engendered by an honest, rigorous academic education are replaced by the disingenuous quick fix, the winner-take-all mentality that neglects the common good.
  • I attended nearby Georgetown University and graduated in 1985. Relative to state schools and elite schools, it was expensive then. I took out loans. I had Pell grants. I had work-study and GSL. I paid my debt of $15,000 off in ten years. Would I have done it differently? Yes: I would have continued on to graduate school and not worried about paying off those big loans right after college. My career worked out and I am grateful for the education I received and paid for. But I would not recommend to my nieces and nephews debts north of $100,000 for a BA in liberal arts. Go community. Then go state. Then punch your ticket to Harvard, Yale or Stanford — if you are good enough.
  • American universities appear to have more and more drifted away from educating individuals and citizens to becoming high priced trade schools and purveyors of occupational licenses. Lost in the process is the concept of expanding a student's ability to appreciate broadly and deeply, as well as the belief that a republican democracy needs an educated citizenry, not a trained citizenry, to function well. Both the Heisman Trophy winner and the producer of a successful tech I.P.O. likely have much in common, a college education whose rewards are limited to the financial. I don't know if I find this more sad on the individual level or more worrisome for the future of America.
  • This is now a consumer world for everything, including institutions once thought to float above the Shakespearean briars of the work-a-day world such as higher education, law and medicine. Students get this. Parents get this. Everything is negotiable: financial aid, a spot in the nicest dorm, tix to the big game. But through all this, there are faculty - lots of 'em - who work away from the fluff to link the ambitions of the students with the reality and rigor of the 21st century. The job of the student is to get beyond the visible hype of the surroundings and find those faculty members. They will make sure your investment is worth it.
  • My experience in managing or working with GW alumni in their 20's or 30's has not been good. Virtually all have been mentally lazy and/or had a stunning sense of entitlement. Basically they've been all talk and no results. That's been quite a contrast to the graduates from VA/MD state universities.
  • More and more, I notice what my debt-financed contributions to the revenue streams of my vendors earn them, not me. My banks earned enough to pay ridiculous bonuses to employees for reckless risk-taking. My satellite tv operator earned enough to overpay ESPN for sports programming that I never watch--and that, in turn, overpays these idiotic pro athletes and college sports administrators. My health insurer earned enough to defeat one-payor insurance; to enable the opaque, inefficient billing practices of hospitals and other providers; and to feed the behemoth pharmaceutical industry. My church earned enough to buy the silence of sex abuse victims and oppose progressive political candidates. And my govt earned enough to continue ag subsidies, inefficient defense spending, and obsolete transportation and energy policies.
  • as the parent of a GWU freshman I am grateful for every opportunity afforded her. She has a generous merit scholarship, is in the honors program with some small classes, and has access to internships that can be done while at school. GWU also gave her AP credits to advance her to sophomore status. Had she attended the state flagship school (where she was accepted into that exclusive honors program) she would have a great education but little else. It's not possible to do a foreign affairs related internship far from D.C. or Manhattan. She went to a very competitive high school where for the one or two ivy league schools in which she was interested, she didn't have the same level of connections or wealth as many of her peers. Whether because of the Common Application or other factors, getting into a good school with financial help is difficult for a middle class student like my daughter who had a 4.0 GPA and 2300 on the SAT. She also worked after school. The bottom line - GWU offered more money than perceived "higher tier" universities, and brought tuition to almost that of our state school system. And by the way, I think she is also getting a very good education.
  • This article reinforces something I have learned during my daughter's college application process. Most students choose a school based on emotion (reputation) and not value. This luxury good analogy holds up.
  • The entire education problem can be solved by MOOCs, lots and lots of them, plus a few closely monitored tests and personal interviews with people. Of course many many people make MONEY off of our entirely inefficient way of "educating" -- are we even really doing that -- getting a degree does NOT mean one is actually educated.
  • As a first-generation college graduate I entered GW ambitious but left saddled with debt, and crestfallen at the hard-hitting realization that my four undergraduate years were an aberration from what life is actually like post-college: not as simple as getting an [unpaid] internship with a fancy titled institution, as most Colonials do. I knew how to get into college, but what do you do after the recess of life ends? I learned more about networking, resume plumping (designated responses to constituents...errr....replied to emails), and elevator pitches than actual theory, economic principles, strong writing skills, critical thinking, analysis, and philosophy. While relatively easy to get a job after graduating (for many with a GW degree this is sadly not the case) sustaining one and excelling in it is much harder. It's never enough just to be able to open a new door; you also need to be prepared to navigate your way through that next opportunity.
  • this is a very telling article. Aimless and directionless high school graduates are matched only by aimless and directionless institutes of higher learning. Each child and each parent should start with a goal - before handing over their hard earned tuition dollars, and/or leaving a trail of broken debt in the aftermath of a substandard, unfocused education.
  • it is no longer the most expensive university in America. It is the 46th. Others have been implementing the Absolut Rolex Plan. John Sexton turned New York University into a global higher-education player by selling the dream of downtown living to students raised on “Sex and the City.” Northeastern followed Boston University up the ladder. Under Steven B. Sample, the University of Southern California became a U.S. News top-25 university. Washington University in St. Louis did the same.
  • I currently attend GW, and I have to say, this article completely misrepresents the situation. I have yet to meet a single person who is paying the full $60k tuition - I myself am paying $30k, because the school gave me $30k in grants. As for the quality of education, Foreign Policy rated GW the #8 best school in the world for undergraduate education in international affairs, Princeton Review ranks it as one of the best schools for political science, and U.S. News ranks the law school #20. The author also ignores the role that an expanding research profile plays in growing a university's prestige and educational power.
  • And in hundreds of regional universities and community colleges, presidents and deans and department chairmen have watched this spectacle of ascension and said to themselves, “That could be me.” Agricultural schools and technical institutes are lobbying state legislatures for tuition increases and Ph.D. programs, fitness centers and arenas for sport. Presidents and boards are drawing up plans to raise tuition, recruit “better” students and add academic programs. They all want to go in one direction — up! — and they are all moving with a single vision of what they want to be.
  • this is the same playbook used by hospitals the past 30 years or so. It is how Hackensack Hospital became Hackensack Medical Center and McComb Hospital became Southwest Mississippi Regional Medical Center. No wonder the results have been the same in healthcare and higher education; both have priced themselves out of reach for average Americans.
  • a world where a college is rated not by the quality of its output, but instead, by the quality of its inputs. A world where there is practically no work to be done by the administration because the college's reputation is made before the first class even begins! This is insanity! But this is the swill that the mammoth college marketing departments nationwide have shoved down America's throat. Colleges are ranked not by the quality of their graduates, but rather, by the test scores of their incoming students!
  • The Pew Foundation has been doing surveys on what students learn, how much homework they do, how much time they spend with professors etc. All good stuff to know before a student chooses a school. It is called the National Survey of Student Engagement (NSSE - called Nessy). It turns out that the higher ranked schools do NOT allow their information to be released to the public. It is SECRET. Why do you think that is?
  • The article blames "the standard university organizational model left teaching responsibilities to autonomous academic departments and individual faculty members, each of which taught and tested in its own way." This is the view of someone who has never taught at a university, nor thought much about how education there actually happens. Once undergraduates get beyond the general requirements, their educations _have_ to depend on "autonomous departments" because only those departments know what the requirements for a given degree can be, and can grant the necessary accreditation of a given student. The idea that some administrator could know what's necessary for degrees in everything from engineering to fiction writing is nonsense, except that's what the people who only know the theory of education (but not its practice) actually seem to think. In the classroom itself, you have tremendously talented people, who nevertheless have their own particular strengths and approaches. Don't you think it's a good idea to let them do what they do best rather than trying to make everyone teach the same way? Don't you think supervision of young teachers by older colleagues, who actually know their field and its pedagogy, rather than some administrator, who knows nothing of the subject, is a good idea?
  • it makes me very sad to see how expensive some public schools have become. Used to be you could work your way through a public school without loans, but not any more. Like you, I had the advantage of a largely-scholarship paid undergraduate education at a top private college. However, I was also offered a virtually free spot in my state university's (then new) honors college
  • My daughter attended a good community college for a couple of classes during her senior year of high school and I could immediately see how such places are laboratories for failure. They seem like high schools in atmosphere and appearance. Students rush in by car and rush out again when the class is over. The four year residency college creates a completely different feel. On arrival, you get the sense that you are engaging in something important, something apart and one that will require your full attention. I don't say this is for everyone or that the model is not flawed in some ways (students actually only spend 2 1/2 yrs. on campus to get the four yr. degree). College is supposed to be a 60 hour per week job. Anything less than that and the student is seeking himself or herself
  • This. Is. STUNNING. I have always wondered, especially as my kids have approached college age, why American colleges have felt justified in raising tuition at a rate that has well exceeded inflation, year after year after year. (Nobody needs a dorm with luxury suites and a lazy river pool at college!) And as it turns out, they did it to become luxury brands. Just that simple. Incredible. I don't even blame this guy at GWU for doing what he did. He wasn't made responsible for all of American higher ed. But I do think we all need to realize what happened, and why. This is front page stuff.
  • I agree with you, but, unfortunately, given the choice between low tuition, primitive dorms, and no athletic center VS expensive & luxurious, the customers (and their parents) are choosing the latter. As long as this is the case, there is little incentive to provide bare-bones and cheap education.
  • Wesleyan University in CT is one school that is moving down the rankings. Syracuse University is another. Reed College is a third. Why? Because these schools try hard to stay out of the marketing game. (With its new president, Syracuse has jumped back into the game.) Bryn Mawr College, outside Philadelphia hasn't fared well over the past few decades in the rankings, which is true of practically every women's college. Wellesley is by far the highest ranked women's college, but even there the acceptance rate is significantly higher than one finds at comparable coed liberal arts colleges like Amherst & Williams. University of Chicago is another fascinating case for Mr. Carey to study (I'm sure he does in his forthcoming book, which I look forward to reading). Although it has always enjoyed an illustrious academic reputation, until recently Chicago's undergraduate reputation paled in comparison to peer institutions on the two coasts. A few years ago, Chicago changed its game plan to more closely resemble Harvard and Stanford in undergraduate amenities, and lo and behold, its rankings shot up. It was a very cynical move on the president's part to reassemble the football team, but it was a shrewd move because athletics draw more money than academics ever can (except at engineering schools like Cal Tech & MIT), and more money draws richer students from fancier secondary schools with higher test scores, which lead to higher rankings - and the beat goes on.
  • The college INDUSTRY is out of control. Sorry, but NYU, GW, and BU are not worth the price. Are state schools any better? We have the University of Michigan, which is really not a state school but a university that gives a discount to people who live in Michigan. Why do I say that? When the undergraduate body is more than 40% out-of-state and pays tuition of over $50K a year, you tell me. Perhaps the solution is two years of community college followed by two at places like U of M or Michigan State - get the same diploma at the end for much less and beat the system.
  • In one recent year, the majority of undergrad professors at Harvard, according to Boston.com, were adjuncts. That means low pay, no benefits, no office: temp workers. At Harvard. Easily available student loans fueled the arms race of amenities and frills in which colleges now engage. They moved the cost of education onto the backs of people, kids, who don't understand what they are doing. Students in colleges these days are customers, and the customers must be able to get through. If that requires dumbing things down, so be it. On top of tuition, GWU is known by its students as the land of added fees on top of added fees. The joke around campus was that they would soon be installing pay toilets in the student union. No one was laughing.
  • You could have written the same story about my alma mater, American University. The place reeked of ambition and upward mobility decades ago and still does. Whoever's running it now must look at its measly half-billion-dollar endowment, compare it to GWU's $1.5 billion, and seethe with envy, while GWU's president sets his sights on an Ivy League-size endowment. And both get back to their real jobs: 24/7 fundraising, which is what university presidents are all about these days. Money - including million-dollar salaries for themselves (GWU's president made more than Harvard's in 2011) - pride, cachet, power, a mansion, first-class all the way. They should just be honest about it and change their university's motto to Ostende mihi pecuniam! ("Show me the money!" - please excuse my questionable Latin). Whether the students are actually learning anything is up to them, I guess - if they do, it's thanks to the professors, adjuncts, and administrative staff, who do the actual work of educating and keep the school running.
  • When I was in high school (in the '70s), many of my richer friends went to GW, and I was then of the impression that GW was a 'good' school. As I age, I have come to realize that this place is just another façade for the emptiness that has become America. All too often we are faced with a dilemma: damned if we do, damned if we don't. Yep, 'education' has become a trap for all too many of our citizens.
  • I transferred to GWU from a state school. I am forever grateful that I did. I wanted to get a good, rigorous education and go to one of the best International Affairs schools in the world. Even though the state school I went to was dirt cheap, the education and the faculty were awful. I transferred to GW and was amazed at the professors at that university. An ambassador or a prominent IA scholar taught every class. GW is an expensive school, but that is the free market. If you want a good education you need to be willing to pay for it or join the military. I did the latter, and my school was completely free with no debt, and I received an amazing education. If young people aren't willing to make some sort of sacrifice to get ahead, or if they just expect everything to be given to them, then our country is in a sad state. We need to stop blaming universities like GWU that strive to attract better students, better professors, and better infrastructure. They are doing what is expected in America: trying to better themselves.
  • "Whether the students are actually learning anything is up to them, I guess." How could it possibly be otherwise??? I am glad that you are willing to give credit to teachers and administrators, but it is not they who "do the actual work of educating." From this fallacy comes its corollary, that we should blame teachers first for "under-performing schools". This long-running show of scapegoating may suit the wallets and vanity of American parents, but it is utterly senseless. When, if ever, American culture stops reeking of arrogance, greed and anti-intellectualism, things may improve, and we may resume the habit of bothering to learn. Until then, nothing doing.
  • Universities sell knowledge and grade students on how much they have learned. Fundamentally, there is a conflict of interest in this setup. Moreover, students who are poorly educated, even if they know this, will not criticize their school, because doing so would make it harder for them to have a career. As such, many problems with higher education remain unexposed to the public.
  • I've lectured and taught in at least five different countries on three continents, and the shortest perusal of what goes on abroad would totally undermine most of these speculations. For one thing, American universities are unique in their dedication to a broad-based, liberal-arts-style education. In France, Italy, or Germany, for example, you select a major like mathematics or physics, and then in your four years you will not take even one course in another subject. The amount of work you do that is critically evaluated by an instructor is a tiny fraction of what is done at an American university. While half-educated critics write criticism like this based on profoundly incomplete research, universities in Germany, Italy, the Netherlands, South Korea, and Japan, as well as France, have appointed committees and commissioned studies to explain why the American system of higher education so drastically outperforms their own. Elsewhere students do get a rather nice dose of general education, but it ends in secondary school, and it has the narrowness and formulaic quality we would normally associate with that. The character who wrote this article probably never set foot on a "campus" of the University of Paris or Rome.
  • The university is part of a complex economic system, and it is responding to the demands of that system. For example, students and parents choose universities that have beautiful campuses and buildings, so universities build beautiful campuses. State support of universities has greatly declined, and this decline in funding is the greatest cause of increased tuition. Therefore universities must compete for dollars and must build to attract students and parents. Also, universities are not ranked on how well they educate students -- that's difficult to measure, so it is not measured. Instead, universities are ranked on research publications. So while universities certainly put much effort into teaching, research has to take priority in order for the university to survive. Also, universities do not force students and parents to attend high-priced institutions; reasonably priced state institutions and community colleges are available to every student. Community colleges have an advantage because they are funded by property taxes. Finally, learning requires good teaching, but it also requires students who come to the university funded, prepared, and engaged. This often does not happen. Conclusion: universities have to participate in profile-raising actions in order to survive. The day that funding is provided for college, rankings are based on education, and students choose campuses with simple buildings, things will change at the university.
  • This is the inevitable result of privatizing higher education. In the not-so-distant past, we paid for great state universities through our taxes, not tuition. Then the states shifted funding to prisons, and the federal government radically cut research support and the GI Bill. Instead, today we expect universities to support themselves through tuition, and to the extent that we offer students support, it is through non-dischargeable loans. To make matters worse, the interest rates on those loans are far above the government's cost of funds -- so in effect the loans are an excise tax on education (most of which is used to support a handful of for-profit institutions that account for the most student defaults). This "consumer sovereignty" privatized model of funding education works no better than privatizing California's electrical system did in the era of Enron, or our privatized funding of medical services, or our increasingly privatized prison system: it drives up costs at the same time that it replaces quality with marketing.
  • There are data in some instances on student learning, but the deeper problem, as I suspect the author already knows, is that there is nothing like a consensus on how to measure that learning, or even on what the proper end point to emphasize is (a lot of what I teach -- I know this from what students have told me -- tends to come into sharp focus years after graduation).
  • Michael (Baltimore) has hit the nail on the head. Universities are increasingly corporatized institutions in the credentialing business. Knowledge, for the few who care about it (often not those paying for the credentials), is available freely because there's no profit in it. Like many corporate entities, universities are increasingly run by highly paid administrators, not faculty.
  • GWU has not defined itself in any unique way; it has merely embraced the bland, but very expensive, accoutrements of American private education: luxury dorms, food courts, spa-like gyms, endless extracurricular activities, etc. But the real culprit for this bloat that students have to bear financially is the college ranking system run by US News, the Princeton Review, and the like - an ultimately meaningless exercise in competition that has nevertheless pushed colleges and universities to become more like one another. A sad state of affairs, and an extremely expensive one for students.
  • It is long past time to recognize the failure of the Reaganomics-neoliberal program of private profits over the public good. In education, we need to return to public institutions that are publicly funded, just as we need to recognize that Medicare, Social Security, the post office, public utilities, fire departments, the interstate highway system, Veterans Administration hospitals, and the GI Bill are models to be improved and expanded, not destroyed.
  • George Washington is actually not a Rolex watch; it is a counterfeit Rolex. The real Rolexes of higher education -- places like Hopkins, Georgetown, Duke, the Ivies, etc. -- have real endowments and real financial aid. No middle-class kid is required to borrow $100,000 to get a degree from those schools, because they offer generous need-based financial aid in the form of grants, not loans. Tuition at the real Rolexes is really a sticker price that only the wealthy pay -- everybody else is on a sliding scale. For middle-class kids who are fortunate enough to get in, Penn actually ends up costing considerably less than a state university. The fake Rolexes -- BU, NYU, Drexel in Philadelphia -- don't have the sliding scale. They bury middle-class students in debt. And really, though it is foolish to borrow $100,000 or $120,000 for an undergraduate degree, I don't find the transaction morally wrong. What is morally wrong is our federal government making that loan non-dischargeable in bankruptcy, so that many of these kids will have their wages garnished for the REST OF THEIR LIVES. There is a very simple solution to this, by the way: cap the amount of non-dischargeable student loan debt at, say, $50,000.
  • The slant of this article is critical of the growth of research universities. I couldn't disagree more. Modern research universities are incredible engines of economic opportunity, not only for the students (who pay the bills) but also for the community, via the creation of blue- and white-collar jobs. Large research universities employ tens of thousands of locals, from custodial and food service workers right up to high-level administrators and specialists in finance, computer services, buildings and facilities management, etc. Johns Hopkins University and the University of Maryland system employ more people than any other industry in Maryland -- including the government. Research universities typically have hospitals providing cutting-edge medical care to the community. Local businesses (from cafes to property rental companies) benefit from a built-in, long-term client base as well as an educated workforce. And of course they are the foundry of new knowledge, which is critical for the future growth of our country. Check out the work of the famed economist Dr. Julia Lane on modeling the economic value of the research university. In a nutshell, there are few better investments America can make in herself than research universities. We are the envy of the world in that regard -- and with good reason. How many *industries* (let alone jobs) has Stanford University alone catalyzed?
  • What universities have a monopoly on is the credential. Anyone can learn, from books, from free lectures on the internet, from this newspaper, etc. But only universities can endow you with the cherished degree. For some reason, people are willing to pay more for one of these pieces of paper with a certain name on it -- Ivy League, Stanford, even GW -- than for another -- Generic State U -- though there is no evidence one is actually worth more in the marketplace of reality than the other. But, by the laws of economics, these places are actually underpriced: after all, something like 20 times more people are trying to buy a Harvard education than are allowed to purchase one. Usually that means you raise your price.
  • Overall a good article, except for this: "This comes on the heels of Richard Arum and Josipa Roksa's "Academically Adrift," a study that found "limited or no learning" among many college students." The measure of learning you report was a general thinking-skills exam. That's not a good measure of college gains. Most psychologists and cognitive scientists worth their salt would tell you that improvement in critical thinking skills is going to be limited to specific areas. In other words, learning critical thinking skills in math will make little change in critical thinking about political science or biology. Thus we should not expect huge improvements in general critical thinking skills, but rather improvements in a student's major and other areas of focus, such as a minor. Although who has time for a minor when it is universally acknowledged that the purpose of a university is to please and profit an employer or, if one is lucky, an investor? Finally, improved critical thinking skills are not the be-all and end-all of a college education, even given this profit-centered perspective. Learning and mastering the cumulative knowledge of past generations is arguably the most important thing to be gained, and most universities still tend to excel at that, even with the increasing mandate to run education like a business and to cultivate and cull the college "consumer".
  • As for community colleges, there was an article in the Times several years ago that said it much better than I could have said it myself: community colleges are places where dreams are put on hold. Without making the full commitment to study, without leaving the home environment, many, if not most, community college students are caught betwixt and between, trying to balance work responsibilities, caring for a young child or baby and attending classes. For males, the classic "end of the road" in community college is to get a car, a job and a girlfriend, one who is not in college, and that is the end of the dream. Some can make it, but most cannot.
  • As a scientist, I disagree with the claim that undergrad tuition subsidizes basic research. Nearly all lab equipment and research personnel (grad students, technicians, anyone with the title "research scientist" or similar) on campus are paid for through federal grants. Professors often spend all their time outside teaching and administration writing grant proposals, as limited federal grant funds mean roughly 85% of proposals must be rejected. What is more, out of each successful grant the university levies a "tax," called "overhead," of 30-40%, nominally to pay for basic operations (utilities, office space, administrators). So in fact one might say research helps fund the university rather than the other way around.
  • It's certainly overrated as a research and graduate-level university. Whether it is good for getting an undergraduate education is unclear, but a big part of the appeal is getting to live in D.C. while attending college, instead of living in some small college town in the cornfields.
kushnerha

The Age of Protest - The New York Times - 0 views

  • If you go to The Guardian’s website these days you can find a section that is just labeled “Protest.” So now, with your morning coffee, you can get your news, weather, sports — and protests.
  • In my view, this age of protest is driven, in part, by the fact that the three largest forces on the planet — globalization, Moore’s law and Mother Nature — are all in acceleration, creating an engine of disruption that is stressing strong countries and middle classes and blowing up weak ones, while superempowering individuals and transforming the nature of work, leadership and government all at once.
  • When you get that much agitation in a world where everyone with a smartphone is now a reporter, news photographer and documentary filmmaker, it’s a wonder that every newspaper doesn’t have a “Protest” section.
  • ...8 more annotations...
  • “People everywhere seem to be morally aroused,” said Seidman. “The philosopher David Hume argued that ‘the moral imagination diminishes with distance.’ It would follow that the opposite is also true: As distance decreases, the moral imagination increases. Now that we have no distance — it’s like we’re all in a crowded theater, making everything personal — we are experiencing the aspirations, hopes, frustrations, plights of others in direct and visceral ways.”
  • “A dentist from Minnesota shoots a cherished lion in Zimbabwe named Cecil, and days later everyone in the world knows about it, triggering a tsunami of moral outrage on Twitter and Facebook. As a result, some people try to shut down his dental practice by posting negative reviews on Yelp and spray paint ‘Lion Killer’ on his Florida vacation home. Almost 400,000 people then sign a petition in one day on Change.org demanding that Delta Air Lines change their policy of transporting trophy kills. Delta does so and other airlines follow. And then hunters who contribute to Zimbabwe’s tourism industry protest the protest, claiming that they were being discriminated against.”
  • That we are becoming more morally aroused “is generally a good thing,” argued Seidman. Institutionalized racism in police departments, or in college fraternities, is real and had been tolerated for way too long. That it’s being called out is a sign of a society’s health “and re-engagement.”
  • But when moral arousal manifests as moral outrage, he added, “it can either inspire or repress a serious conversation or the truth.”
  • “If moral outrage, as justified as it may be, is followed immediately by demands for firings or resignations,” argued Seidman, “it can result in a vicious cycle of moral outrage being met with equal outrage, as opposed to a virtuous cycle of dialogue and the hard work of forging real understanding and enduring agreements.”
  • Furthermore, "when moral outrage skips over moral conversation, then the outcome is likely going to be acquiescence, not inspired solutions," Seidman added. It can also feed the current epidemic of inauthentic apologies, "since apologies extracted under pressure are like telling a child, 'Just say you're sorry,' to move past the issue without ever making amends."
  • it’s as if “we’re living in a never-ending storm,” he said. Alas, though, resolving moral disputes “requires perspective, fuller context and the ability to make meaningful distinctions.”
  • requires leaders with the courage and empathy “to inspire people to pause to reflect, so that instead of reacting by yelling in 140 characters they can channel all this moral outrage into deep and honest conversations.”
Javier E

Living Another Day, Thanks to Grandparents Who Couldn't Sleep - The New York Times - 1 views

  • A new study, published Tuesday in Proceedings of the Royal Society B, suggests that the way sleep patterns change with age may be an evolutionary adaptation that helped our ancestors survive the night by ensuring one person in a community was awake at all times. The researchers called this phenomenon the “poorly sleeping grandparent hypothesis,” suggesting that an older member of a community who woke before dawn might have been crucial to spotting the threat of a hungry predator while younger people were still asleep. It may explain why people slept in mixed-age groups through much of human history.
  • The Hadza sleeping environment may have similarities to that of earlier humans, researchers said. They sleep outdoors or in grass huts in groups of 20 to 30 people without artificially regulating temperature or light. These conditions provide a suitable window to study the evolutionary aspects of sleep.
  • Across more than 220 total hours of sleep observation, researchers found only 18 minutes when all adults were sound asleep simultaneously. Typically, older participants in their 50s and 60s went to bed earlier and woke up earlier than those in their 20s and 30s. On average, more than a third of the group was alert, or lightly dozing, at any given time.
  • ...3 more annotations...
  • “We have a propensity to overcategorize things as disorders in the West,” said David Samson, an author of the study and an assistant professor of anthropology at the University of Toronto. “It might help elderly individuals to know changes they’re experiencing have an evolutionary reason.”
  • “The variation may be partially explained by genetics,” she said, “but there are environmental conditions too.” As people age, their social needs and level of activity change, potentially affecting their sleep patterns.
  • there is evidence of a genetic link, she added, pointing out that sleep quality declined among the older Hadza even while they remained active hunters and gatherers.
Javier E

History News Network | Just How Stupid Are We? Facing the Truth About Donald Trump's Am... - 1 views

  • Just How Stupid Are We? Facing the Truth About the American Voter. The book is filled with statistics like these: ● A majority of Americans don't know which party is in control of Congress. ● A majority can't name the chief justice of the Supreme Court. ● A majority don't know we have three branches of government.
  • suddenly mainstream media pundits have discovered how ignorant millions of voters are.  See this and this and this and this.  More importantly, the concern with low-information voters has become widespread.  Many are now wondering what country they’re living in. 
  • The answer science gives us (the title of my last book and this essay notwithstanding) is not that people fall for slick charlatans like Trump because they’re stupid.
  • ...19 more annotations...
  •  The problem is that we humans didn’t evolve to live in the world in which we find ourselves.  As the social scientists Leda Cosmides and John Tooby put it, the human mind was “designed to solve the day-to-day problems of our hunter-gatherer ancestors. These stone age priorities produced a brain far better at solving some problems than others.” 
  • there are four failings common to human beings as a result of our Stone-Age brain that hinder us in politics.
  • why are we this way?  Science suggests that one reason is that we evolved to win in social settings and in such situations the truth doesn't matter as much as sheer doggedness
  • Second, we find it hard to size up politicians correctly.  The reason for this is that we rely on instant impressions. 
  • This stops voters from worrying that they need to bolster their impressions by consulting experts and reading news stories from a broad array of ideological viewpoints.  Why study when you can rely on your gut instinct?
  • Third, we aren’t inclined to reward politicians who tell us hard truths.
  • First, most people find it easy to ignore politics because it usually involves people they don’t know.  As human beings we evolved to care about people in our immediate vicinity.  Our nervous system kicks into action usually only when we meet people face-to-face
  •  This has left millions of voters on their own.  Lacking information, millions do what you would expect.  They go with their gut
  • most of the time we return to a state of well-being by simply ignoring the evidence we find discomforting.  This is known as Disconfirmation Bias and it afflicts all of us
  • Fourth, we frequently fail to show empathy in circumstances that clearly cry out for it.
  • We evolved to show empathy for people we know.  It takes special effort to empathize with people who don’t dress like us or look like us.
  • long-term we need to teach voters not to trust their instincts in politics because our instincts often don’t work.
  • Doing politics in a modern mass democracy, in other words, is an unnatural act.
  • Teaching this lesson doesn’t sound like a job for historians, but in one way it is.  Studying history is all about putting events into context. And as it turns out, voters need to learn the importance of context.
  • Given the mismatch between our Stone-Age brain and the problems we face in the 21st century, we should only trust our political instincts when those instincts are serviceable in a modern context.  If they aren’t (and most of the time they aren't), then higher order cognitive thinking is required.
  • Just why mass ignorance seems to be afflicting our politics at this moment is a complicated question.  But here again history can be helpful.  The answer seems to be that the institutions voters formerly could turn to for help have withered.
  • We don't want the truth to prevail, as Harvard's Steven Pinker informs us; we want our version of the truth to prevail, for in the end what we're really concerned with is maintaining our status or enhancing it.
  • But cultural norms can be established that help us overcome our natural inclinations.
  • don’t have much confidence that people in general will be willing on their own to undertake the effort.
Javier E

Resist the Internet - The New York Times - 0 views

  • Definitely if you’re young, increasingly if you’re old, your day-to-day, minute-to-minute existence is dominated by a compulsion to check email and Twitter and Facebook and Instagram with a frequency that bears no relationship to any communicative need.
  • it requires you to focus intensely, furiously, and constantly on the ephemera that fills a tiny little screen, and experience the traditional graces of existence — your spouse and friends and children, the natural world, good food and great art — in a state of perpetual distraction.
  • Used within reasonable limits, of course, these devices also offer us new graces. But we are not using them within reasonable limits.
  • ...9 more annotations...
  • They are the masters; we are not. They are built to addict us, as the social psychologist Adam Alter’s new book “Irresistible” points out — and to madden us, distract us, arouse us and deceive us.
  • We primp and perform for them as for a lover; we surrender our privacy to their demands; we wait on tenterhooks for every “like.” The smartphone is in the saddle, and it rides mankind.
  • the internet, like alcohol, may be an example of a technology that should be sensibly restricted in custom and in law.
  • It certainly delivers some social benefits, some intellectual advantages, and contributes an important share to recent economic growth.
  • there are also excellent reasons to think that online life breeds narcissism, alienation and depression, that it’s an opiate for the lower classes and an insanity-inducing influence on the politically-engaged, and that it takes more than it gives from creativity and deep thought. Meanwhile the age of the internet has been, thus far, an era of bubbles, stagnation and democratic decay — hardly a golden age whose customs must be left inviolate.
  • So a digital temperance movement would start by resisting the wiring of everything, and seek to create more spaces in which internet use is illegal, discouraged or taboo. Toughen laws against cellphone use in cars, keep computers out of college lecture halls, put special “phone boxes” in restaurants where patrons would be expected to deposit their devices, confiscate smartphones being used in museums and libraries and cathedrals, create corporate norms that strongly discourage checking email in a meeting.
  • Then there are the starker steps. Get computers — all of them — out of elementary schools, where there is no good evidence that they improve learning. Let kids learn from books for years before they’re asked to go online for research; let them play in the real before they’re enveloped by the virtual
  • The age of consent should be 16, not 13, for Facebook accounts. Kids under 16 shouldn’t be allowed on gaming networks. High school students shouldn’t bring smartphones to school. Kids under 13 shouldn’t have them at all.
  • I suspect that versions of these ideas will be embraced within my lifetime by a segment of the upper class and a certain kind of religious family. But the masses will still be addicted, and the technology itself will have evolved to hook and immerse — and alienate and sedate — more completely and efficiently.
Javier E

Have Smartphones Destroyed a Generation? - The Atlantic - 0 views

  • She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
  • The arrival of the smartphone has radically changed every aspect of teenagers’ lives, from the nature of their social interactions to their mental health. These changes have affected young people in every corner of the nation and in every type of household
  • Around 2012, I noticed abrupt shifts in teen behaviors and emotional states. The gentle slopes of the line graphs became steep mountains and sheer cliffs, and many of the distinctive characteristics of the Millennial generation began to disappear. In all my analyses of generational data—some reaching back to the 1930s—I had never seen anything like it.
  • ...54 more annotations...
  • the trends persisted, across several years and a series of national surveys. The changes weren’t just in degree, but in kind.
  • The biggest difference between the Millennials and their predecessors was in how they viewed the world; teens today differ from the Millennials not just in their views but in how they spend their time. The experiences they have every day are radically different from those of the generation that came of age just a few years before them.
  • it was exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.
  • theirs is a generation shaped by the smartphone and by the concomitant rise of social media. I call them iGen
  • Born between 1995 and 2012, members of this generation are growing up with smartphones, have an Instagram account before they start high school, and do not remember a time before the internet.
  • iGen’s oldest members were early adolescents when the iPhone was introduced, in 2007, and high-school students when the iPad entered the scene, in 2010. A 2017 survey of more than 5,000 American teens found that three out of four owned an iPhone.
  • I had grown accustomed to line graphs of trends that looked like modest hills and valleys. Then I began studying Athena's generation.
  • More comfortable in their bedrooms than in a car or at a party, today’s teens are physically safer than teens have ever been. They’re markedly less likely to get into a car accident and, having less of a taste for alcohol than their predecessors, are less susceptible to drinking’s attendant ills.
  • Psychologically, however, they are more vulnerable than Millennials were: Rates of teen depression and suicide have skyrocketed since 2011. It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.
  • the twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever. There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.
  • But the allure of independence, so powerful to previous generations, holds less sway over today’s teens, who are less likely to leave the house without their parents. The shift is stunning: 12th-graders in 2015 were going out less often than eighth-graders did as recently as 2009.
  • Today’s teens are also less likely to date. The initial stage of courtship, which Gen Xers called “liking” (as in “Ooh, he likes you!”), kids now call “talking”—an ironic choice for a generation that prefers texting to actual conversation. After two teens have “talked” for a while, they might start dating.
  • only about 56 percent of high-school seniors in 2015 went out on dates; for Boomers and Gen Xers, the number was about 85 percent.
  • The decline in dating tracks with a decline in sexual activity. The drop is the sharpest for ninth-graders, among whom the number of sexually active teens has been cut by almost 40 percent since 1991. The average teen now has had sex for the first time by the spring of 11th grade, a full year later than the average Gen Xer
  • The teen birth rate hit an all-time low in 2016, down 67 percent since its modern peak, in 1991.
  • Nearly all Boomer high-school students had their driver’s license by the spring of their senior year; more than one in four teens today still lack one at the end of high school.
  • In conversation after conversation, teens described getting their license as something to be nagged into by their parents—a notion that would have been unthinkable to previous generations.
  • In the late 1970s, 77 percent of high-school seniors worked for pay during the school year; by the mid-2010s, only 55 percent did. The number of eighth-graders who work for pay has been cut in half.
  • Beginning with Millennials and continuing with iGen, adolescence is contracting again—but only because its onset is being delayed. Across a range of behaviors—drinking, dating, spending time unsupervised— 18-year-olds now act more like 15-year-olds used to, and 15-year-olds more like 13-year-olds. Childhood now stretches well into high school.
  • In an information economy that rewards higher education more than early work history, parents may be inclined to encourage their kids to stay home and study rather than to get a part-time job. Teens, in turn, seem to be content with this homebody arrangement—not because they’re so studious, but because their social life is lived on their phone. They don’t need to leave home to spend time with their friends.
  • eighth-, 10th-, and 12th-graders in the 2010s actually spend less time on homework than Gen X teens did in the early 1990s.
  • The time that seniors spend on activities such as student clubs and sports and exercise has changed little in recent years. Combined with the decline in working for pay, this means iGen teens have more leisure time than Gen X teens did, not less.
  • So what are they doing with all that time? They are on their phone, in their room, alone and often distressed.
  • despite spending far more time under the same roof as their parents, today’s teens can hardly be said to be closer to their mothers and fathers than their predecessors were. “I’ve seen my friends with their families—they don’t talk to them,” Athena told me. “They just say ‘Okay, okay, whatever’ while they’re on their phones. They don’t pay attention to their family.” Like her peers, Athena is an expert at tuning out her parents so she can focus on her phone.
  • The number of teens who get together with their friends nearly every day dropped by more than 40 percent from 2000 to 2015; the decline has been especially steep recently.
  • Eighth-graders who are heavy users of social media increase their risk of depression by 27 percent, while those who play sports, go to religious services, or even do homework more than the average teen cut their risk significantly.
  • The roller rink, the basketball court, the town pool, the local necking spot—they’ve all been replaced by virtual spaces accessed through apps and the web.
  • The results could not be clearer: Teens who spend more time than average on screen activities are more likely to be unhappy, and those who spend more time than average on nonscreen activities are more likely to be happy.
  • There’s not a single exception. All screen activities are linked to less happiness, and all nonscreen activities are linked to more happiness
  • Eighth-graders who spend 10 or more hours a week on social media are 56 percent more likely to say they’re unhappy than those who devote less time to social media
  • If you were going to give advice for a happy adolescence based on this survey, it would be straightforward: Put down the phone, turn off the laptop, and do something—anything—that does not involve a screen
  • Social-networking sites like Facebook promise to connect us to friends. But the portrait of iGen teens emerging from the data is one of a lonely, dislocated generation. Teens who visit social-networking sites every day but see their friends in person less frequently are the most likely to agree with the statements “A lot of times I feel lonely,” “I often feel left out of things,” and “I often wish I had more good friends.” Teens’ feelings of loneliness spiked in 2013 and have remained high since.
  • This doesn’t always mean that, on an individual level, kids who spend more time online are lonelier than kids who spend less time online.
  • Teens who spend more time on social media also spend more time with their friends in person, on average—highly social teens are more social in both venues, and less social teens are less so.
  • The more time teens spend looking at screens, the more likely they are to report symptoms of depression.
  • It’s not only a matter of fewer kids partying; fewer kids are spending time simply hanging out
  • Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan. (That’s much more than the risk related to, say, watching TV.)
  • Since 2007, the homicide rate among teens has declined, but the suicide rate has increased. As teens have started spending less time together, they have become less likely to kill one another, and more likely to kill themselves. In 2011, for the first time in 24 years, the teen suicide rate was higher than the teen homicide rate.
  • For all their power to link kids day and night, social media also exacerbate the age-old teen concern about being left out.
  • Today’s teens may go to fewer parties and spend less time together in person, but when they do congregate, they document their hangouts relentlessly—on Snapchat, Instagram, Facebook. Those not invited to come along are keenly aware of it. Accordingly, the number of teens who feel left out has reached all-time highs across age groups.
  • Forty-eight percent more girls said they often felt left out in 2015 than in 2010, compared with 27 percent more boys. Girls use social media more often, giving them additional opportunities to feel excluded and lonely when they see their friends or classmates getting together without them.
  • Social media levy a psychic tax on the teen doing the posting as well, as she anxiously awaits the affirmation of comments and likes. When Athena posts pictures to Instagram, she told me, “I’m nervous about what people think and are going to say. It sometimes bugs me when I don’t get a certain amount of likes on a picture.”
  • Girls have also borne the brunt of the rise in depressive symptoms among today’s teens. Boys’ depressive symptoms increased by 21 percent from 2012 to 2015, while girls’ increased by 50 percent—more than twice as much
  • The rise in suicide, too, is more pronounced among girls. Although the rate increased for both sexes, three times as many 12-to-14-year-old girls killed themselves in 2015 as in 2007, compared with twice as many boys
  • Social media give middle- and high-school girls a platform on which to carry out the style of aggression they favor, ostracizing and excluding other girls around the clock.
  • I asked my undergraduate students at San Diego State University what they do with their phone while they sleep. Their answers were a profile in obsession. Nearly all slept with their phone, putting it under their pillow, on the mattress, or at the very least within arm’s reach of the bed. They checked social media right before they went to sleep, and reached for their phone as soon as they woke up in the morning
  • the smartphone is cutting into teens’ sleep: Many now sleep less than seven hours most nights. Sleep experts say that teens should get about nine hours of sleep a night; a teen who is getting less than seven hours a night is significantly sleep deprived
  • Fifty-seven percent more teens were sleep deprived in 2015 than in 1991. In just the four years from 2012 to 2015, 22 percent more teens failed to get seven hours of sleep.
  • Two national surveys show that teens who spend three or more hours a day on electronic devices are 28 percent more likely to get less than seven hours of sleep than those who spend fewer than three hours, and teens who visit social-media sites every day are 19 percent more likely to be sleep deprived.
  • Teens who read books and magazines more often than the average are actually slightly less likely to be sleep deprived—either reading lulls them to sleep, or they can put the book down at bedtime.
  • Sleep deprivation is linked to myriad issues, including compromised thinking and reasoning, susceptibility to illness, weight gain, and high blood pressure. It also affects mood: People who don’t sleep enough are prone to depression and anxiety.
  • correlations between depression and smartphone use are strong enough to suggest that more parents should be telling their kids to put down their phone.
  • What’s at stake isn’t just how kids experience adolescence. The constant presence of smartphones is likely to affect them well into adulthood. Among people who suffer an episode of depression, at least half become depressed again later in life. Adolescence is a key time for developing social skills; as teens spend less time with their friends face-to-face, they have fewer opportunities to practice them
  • Significant effects on both mental health and sleep time appear after two or more hours a day on electronic devices. The average teen spends about two and a half hours a day on electronic devices. Some mild boundary-setting could keep kids from falling into harmful habits.
Javier E

In Defense of Facts - The Atlantic - 1 views

  • over 13 years, he has published a series of anthologies—of the contemporary American essay, of the world essay, and now of the historical American essay—that misrepresents what the essay is and does, that falsifies its history, and that contains, among its numerous selections, very little one would reasonably classify within the genre. And all of this to wide attention and substantial acclaim
  • D’Agata’s rationale for his “new history,” to the extent that one can piece it together from the headnotes that preface each selection, goes something like this. The conventional essay, nonfiction as it is, is nothing more than a delivery system for facts. The genre, as a consequence, has suffered from a chronic lack of critical esteem, and thus of popular attention. The true essay, however, deals not in knowing but in “unknowing”: in uncertainty, imagination, rumination; in wandering and wondering; in openness and inconclusion
  • Every piece of this is false in one way or another.
  • ...31 more annotations...
  • There are genres whose principal business is fact—journalism, history, popular science—but the essay has never been one of them. If the form possesses a defining characteristic, it is that the essay makes an argument
  • That argument can rest on fact, but it can also rest on anecdote, or introspection, or cultural interpretation, or some combination of all these and more
  • what makes a personal essay an essay and not just an autobiographical narrative is precisely that it uses personal material to develop, however speculatively or intuitively, a larger conclusion.
  • Nonfiction is the source of the narcissistic injury that seems to drive him. “Nonfiction,” he suggests, is like saying “not art,” and if D’Agata, who has himself published several volumes of what he refers to as essays, desires a single thing above all, it is to be known as a maker of art.
  • D’Agata tells us that the term has been in use since about 1950. In fact, it was coined in 1867 by the staff of the Boston Public Library and entered widespread circulation after the turn of the 20th century. The concept’s birth and growth, in other words, did coincide with the rise of the novel to literary preeminence, and nonfiction did long carry an odor of disesteem. But that began to change at least as long ago as the 1960s, with the New Journalism and the “nonfiction novel.”
  • What we really seem to get in D’Agata’s trilogy, in other words, is a compendium of writing that the man himself just happens to like, or that he wants to appropriate as a lineage for his own work.
  • What it’s like is abysmal: partial to trivial formal experimentation, hackneyed artistic rebellion, opaque expressions of private meaning, and modish political posturing
  • If I bought a bag of chickpeas and opened it to find that it contained some chickpeas, some green peas, some pebbles, and some bits of goat poop, I would take it back to the store. And if the shopkeeper said, “Well, they’re ‘lyric’ chickpeas,” I would be entitled to say, “You should’ve told me that before I bought them.”
  • when he isn’t cooking quotes or otherwise fudging the record, he is simply indifferent to issues of factual accuracy, content to rely on a mixture of guesswork, hearsay, and his own rather faulty memory.
  • His rejoinders are more commonly a lot more hostile—not to mention juvenile (“Wow, Jim, your penis must be so much bigger than mine”), defensive, and in their overarching logic, deeply specious. He’s not a journalist, he insists; he’s an essayist. He isn’t dealing in anything as mundane as the facts; he’s dealing in “art, dickhead,” in “poetry,” and there are no rules in art.
  • D’Agata replies that there is something between history and fiction. “We all believe in emotional truths that could never hold water, but we still cling to them and insist on their relevance.” The “emotional truths” here, of course, are D’Agata’s, not Presley’s. If it feels right to say that tae kwon do was invented in ancient India (not modern Korea, as Fingal discovers it was), then that is when it was invented. The term for this is truthiness.
  • D’Agata clearly wants to have it both ways. He wants the imaginative freedom of fiction without relinquishing the credibility (and for some readers, the significance) of nonfiction. He has his fingers crossed, and he’s holding them behind his back. “John’s a different kind of writer,” an editor explains to Fingal early in the book. Indeed he is. But the word for such a writer isn’t essayist. It’s liar.
  • The point of all this nonsense, and a great deal more just like it, is to advance an argument about the essay and its history. The form, D'Agata's story seems to go, was neglected during the long ages that worshiped "information" but slowly emerged during the 19th and 20th centuries as artists learned to defy convention and untrammel their imaginations, coming fully into its own over the past several decades with the dawning recognition of the illusory nature of knowledge.
  • Most delectable is when he speaks about “the essay’s traditional ‘five-paragraph’ form.” I almost fell off my chair when I got to that one. The five-paragraph essay—introduction, three body paragraphs, conclusion; stultifying, formulaic, repetitive—is the province of high-school English teachers. I have never met one outside of a classroom, and like any decent college writing instructor, I never failed to try to wean my students away from them. The five-paragraph essay isn’t an essay; it’s a paper.
  • When he refers to his selections as essays, he does more than falsify the essay as a genre. He also effaces all the genres that they do belong to: not only poetry, fiction, journalism, and travel, but, among his older choices, history, parable, satire, the sermon, and more—genres that possess their own particular traditions, conventions, and expectations.
  • By ignoring the actual contexts of his selections, and thus their actual intentions, D'Agata makes the familiar contemporary move of imposing his own conceits and concerns upon the past. That is how ethnography turns into "song," Socrates into an essayist, and the whole of literary history into a single man's "emotional truth."
  • The history of the essay is indeed intertwined with “facts,” but in a very different way than D’Agata imagines. D’Agata’s mind is Manichaean. Facts bad, imagination good
  • What he fails to understand is that facts and the essay are not antagonists but siblings, offspring of the same historical moment
  • one needs to recognize that facts themselves have a history.
  • Facts are not just any sort of knowledge, such as also existed in the ancient and medieval worlds. A fact is a unit of information that has been established through uniquely modern methods
  • Fact, etymologically, means “something done”—that is, an act or deed
  • It was only in the 16th century—an age that saw the dawning of a new empirical spirit, one that would issue not only in modern science, but also in modern historiography, journalism, and scholarship—that the word began to signify our current sense of “real state of things.”
  • It was at this exact time, and in this exact spirit, that the essay was born. What distinguished Montaigne’s new form—his “essays” or attempts to discover and publish the truth about himself—was not that it was personal (precursors like Seneca also wrote personally), but that it was scrupulously investigative. Montaigne was conducting research into his soul, and he was determined to get it right.
  • His famous motto, Que sais-je?—“What do I know?”—was an expression not of radical doubt but of the kind of skepticism that fueled the modern revolution in knowledge.
  • It is no coincidence that the first English essayist, Galileo’s contemporary Francis Bacon, was also the first great theorist of science.
  • That knowledge is problematic—difficult to establish, labile once created, often imprecise and always subject to the limitations of the human mind—is not the discovery of postmodernism. It is a foundational insight of the age of science, of fact and information, itself.
  • The point is not that facts do not exist, but that they are unstable (and are becoming more so as the pace of science quickens). Knowledge is always an attempt. Every fact was established by an argument—by observation and interpretation—and is susceptible to being overturned by a different one
  • A fact, you might say, is nothing more than a frozen argument, the place where a given line of investigation has come temporarily to rest.
  • Sometimes those arguments are scientific papers. Sometimes they are news reports, which are arguments with everything except the conclusions left out (the legwork, the notes, the triangulation of sources—the research and the reasoning).
  • When it comes to essays, though, we don’t refer to those conclusions as facts. We refer to them as wisdom, or ideas
  • the essay draws its strength not from separating reason and imagination but from putting them in conversation. A good essay moves fluidly between thought and feeling. It subjects the personal to the rigors of the intellect and the discipline of external reality. The truths it finds are more than just emotional.
cvanderloo

Long COVID: who is at risk? - 0 views

  • But some people have long-lasting symptoms after their infection – this has been dubbed “long COVID”.
  • In defining who is at risk from long COVID and the mechanisms involved, we may reveal suitable treatments to be tried – or whether steps taken early in the course of the illness might ameliorate it.
  • Indeed, early analysis of self-reported data submitted through the COVID Symptom Study app suggests that 13% of people who experience COVID-19 symptoms have them for more than 28 days, while 4% have symptoms after more than 56 days.
  • ...7 more annotations...
  • Patients in this study had a mean age of 44 years, so were very much part of the young, working-age population. Only 18% had been hospitalised with COVID-19, meaning organ damage may occur even after a non-severe infection.
  • Another piece of early research (awaiting peer review) suggests that SARS-CoV-2 could also have a long-term impact on people’s organs.
  • Perhaps unsurprisingly, people with more severe disease initially – characterised by more than five symptoms – seem to be at increased risk of long COVID. Older age and being female also appear to be risk factors for having prolonged symptoms, as is having a higher body mass index.
  • Rather harder to explore is the symptom of fatigue. Another recent large-scale study has shown that this symptom is common after COVID-19 – occurring in more than half of cases – and appears unrelated to the severity of the early illness.
  • While men are at increased risk of severe infection, that women seem to be more affected by long COVID may reflect their different or changing hormone status.
  • Some symptoms of long COVID overlap with menopausal symptoms, and hormone replacement using medication may be one route to reducing the impact of symptoms.
  • What is clear, however, is that long-term symptoms after COVID-19 are common, and that research into the causes and treatments of long COVID will likely be needed long after the outbreak itself has subsided.
anonymous

How the World's Oldest Wooden Sculpture Is Reshaping Prehistory - The New York Times - 0 views

  • How the World’s Oldest Wooden Sculpture Is Reshaping Prehistory
  • At 12,500 years old, the Shigir Idol is by far the earliest known work of ritual art. Only decay has kept others from being found.
  • The world’s oldest known wooden sculpture — a nine-foot-tall totem pole thousands of years old — looms over a hushed chamber of an obscure Russian museum in the Ural Mountains, not far from the Siberian border
  • ...24 more annotations...
  • Shigir Idol
  • Dug out of a peat bog by gold miners in 1890, the relic, or what’s left of it, is carved from a great slab of freshly cut larch.
  • Scattered among the geometric patterns (zigzags, chevrons, herringbones) are eight human faces, each with slashes for eyes that peer not so benignly from the front and back planes.
  • “Whether it screams or shouts or sings, it projects authority, possibly malevolent authority. It’s not immediately a friend of yours, much less an ancient friend of yours.”
  • In archaeology, portable prehistoric sculpture is called “mobiliary art.”
  • The statue’s age was a matter of conjecture until 1997, when it was carbon-dated by Russian scientists to about 9,500 years old, an age that struck most scholars as fanciful.
  • The statue was more than twice as old as the Egyptian pyramids and Stonehenge, as well as, by many millenniums, the first known work of ritual art.
  • A new study that Dr. Terberger wrote with some of the same colleagues in Quaternary International, further skews our understanding of prehistory by pushing back the original date of the Shigir Idol by another 900 years, placing it in the context of the early art in Eurasia.
  • “During the period of rapid cooling from about 10,700 B.C. to 9,600 B.C. that we call the Younger Dryas, no beavers should have been around in the Transurals,” he said.)
  • Written with an eye toward disentangling Western science from colonialism, Dr. Terberger’s latest paper challenges the ethnocentric notion that pretty much everything, including symbolic expression and philosophical perceptions of the world, came to Europe by way of the sedentary farming communities in the Fertile Crescent 8,000 years ago.
  • “It’s similar to the ‘Neanderthals did not make art’ fable, which was entirely based on absence of evidence,
  • Likewise, the overwhelming scientific consensus used to hold that modern humans were superior in key ways, including their ability to innovate, communicate and adapt to different environments.
  • Nonsense, all of it.”
  • makes it clear that arguments about the wealth of mobiliary art in, say, the Upper Paleolithic of Germany or France, by comparison to southern Europe, are largely nonsensical and an artifact of tundra (where there are no trees and you use ivory, which is archaeologically visible) versus open forest environments
  • The Shigir Idol, named for the bog near Kirovgrad in which it was found, is presumed to have rested on a rock base for perhaps two or three decades before toppling into a long-gone paleo-lake, where the peat’s antimicrobial properties protected it like a time capsule.
  • “It was not a scientific construction,”
  • “The rings tell us that trees were growing very slowly, as the temperature was still quite cold,”
  • Dr. Terberger respectfully disagrees.
  • “The landscape changed, and the art — figurative designs and naturalistic animals painted in caves and carved in rock — did, too, perhaps as a way to help people come to grips with the challenging environments they encountered.”
  • And what do the engravings mean? Svetlana Savchenko, the artifact’s curator and an author on the study, speculates that the eight faces may well contain encrypted information about ancestor spirits, the boundary between earth and sky, or a creation myth.
  • The temple’s stones were carved around 11,000 years ago, which makes them 1,500 years younger than the Shigir Idol.
  • One could wonder how many similar pieces have been lost over time due to poor preservation conditions.
  • The similarity of the geometric motifs to others across Europe in that era, he added, “is evidence of long-distance contacts and a shared sign language over vast areas. The sheer size of the idol also seems to indicate it was meant as a marker in the landscape that was supposed to be seen by other hunter-gatherer groups — perhaps marking the border of a territory, a warning or welcoming sign.”
  • “What do you think is the hardest thing to find in the Stone Age archaeology of the Urals?” A pause: Sites? “No,” he said, sighing softly. “Funding.”
cvanderloo

Vaccine Eligibility In Many States Expanding To Include All Adults : Coronavirus Update... - 1 views

  • Nearly half of U.S. states will have opened COVID-19 vaccinations to all adults by April 15, officials said Friday, putting them weeks ahead of the May 1 deadline that President Biden announced earlier this month.
  • Jeff Zients, Biden's COVID-19 czar, said that 46 states and Washington, D.C., have announced plans to expand eligibility to all adults by May 1.
  • "It's clear there is a case for optimism, but there is not a case for relaxation," Zients said. "This is not the time to let down our guard. We need to follow the public health guidance, wear a mask, socially distance and get a vaccine when it's your turn."
  • ...6 more annotations...
  • Alaska became the first state to make vaccinations available to everyone age 16 and older earlier this month, followed by Mississippi. Several others have since followed suit, including Arizona, Utah, Indiana, Georgia and West Virginia.
  • Other states are moving to make more groups eligible ahead of schedule, based on age or underlying conditions.
  • According to a map released by the White House COVID-19 Response Team on Friday, four states have yet to confirm plans to expand eligibility by the May 1 deadline: New York, Wyoming, Arkansas and South Carolina, where officials have said they are not on track to hit that threshold until May 3.
  • Dr. Rochelle Walensky, director of the Centers for Disease Control and Prevention, said at the briefing that the country has seen an uptick in case counts and hospital admissions, with the most recent 7-day averages showing about 57,000 cases and 4,700 hospital admissions per day, and deaths hovering around 1,000 per day.
  • The U.S. is administering 2.5 million shots a day at its current pace, Zients said, adding that vaccine makers are "setting and hitting targets." Some 27 million doses went to states, tribes and territories this week.
  • Johnson & Johnson has accelerated production of its single-shot vaccine and is on track to deliver 11 million doses next week.