
blythewallick

Can Artificial Intelligence Be Creative? | JSTOR Daily - 0 views

  • Machines can write compelling ad copy and solve complex “real life” problems. Should the creative class be worried?
  • Rich breaks down some “abstract” problems into their fundamental parts and shows how, with comprehensive enough data and well-structured enough logical programming, AI could be suited to tackle creative problems. In one case, she offers a “real world” example about a manufacturing company’s new line of products and their plans, goals, and expectations for marketing the new line to a specific city.
  • almost all problems rely (or ought to rely) on an understanding of the nature of both knowledge and reasoning. Humanists are trying to solve many of these same problems. Thus there is room for a good deal of interaction between artificial intelligence and many disciplines within the humanities.
  • ...1 more annotation...
  • There is much to be said, however, for art’s ability to evoke emotion based on common experience, sincerity, talent, and unique skill. Rich proves that AI can be used to answer complicated questions. But what we think of as creative work in the humanities is much more often about asking questions than it is about answering them.
sanderk

What Makes a Good Scientist? | American Scientist - 0 views

  • Those looking for practical advice about the job market would be better served seeking counsel from those who have more recently experienced it. Instead, he builds his book around a more fundamental and timeless career question: What makes a good scientist?
  • he encourages the reader to daydream, work hard, and mess around. Wilson relates several examples of “messing around” in his own career. Some led to major discoveries—for example, developing a new “chilling and mixing” method for swapping ant queens of different species to determine whether trait differences are genetically determined.
  • Most young scientists are not prepared for the level and number of setbacks they are likely to encounter in their early careers. As scientific funding has been cut and tenure-track jobs have grown scarce, today’s young scientists will be rejected more than any scientists of equal caliber in the past century.
  • ...2 more annotations...
  • Graduate students will inevitably encounter failed experiments, failed teachable moments in the classroom, and time spent on research that never gets funded or never comes to fruition. Wilson counsels patience: “A strong work ethic is absolutely essential. There must be an ability to pass long hours in study and research with pleasure even though some of the effort will inevitably lead to dead ends.”
  • Scientists, he claims, tend to be introverted and prone to daydreaming. They reject authority and therefore dislike being told what to do. Their attention wanders. This pigeonholing made me uncomfortable, because I think science benefits from incorporating a diversity of personalities
Javier E

The Adams Principle ❧ Current Affairs - 0 views

  • This type of glib quasi-logic works really well in comedy, especially in a format where space is restricted, and where the quick, disposable nature of the strip limits your ability to draw humor from character and plot. You take an idea, find a way to subvert or deconstruct it, and you get an absurd result.
  • while the idea of a “cubicle job” can seem to younger readers like relative bliss, they were (and are) still an emblem of boredom and absurdity, a sign that life was being slowly colonized by gray shapes and Powerpoint slides. Throughout his classic-era work, Adams hits on the feeling that the world has been made unnatural, unconducive to life; materially adequate, but spiritually exhausting. 
  • He makes constant use of something I’m going to call, for want of a better term, the sophoid: something which has the outer semblance of wisdom, but none of the substance; something that sounds weighty if you say it confidently enough, yet can be easily thrown away as “just a thought” if it won’t hold up to scrutiny.
  • ...10 more annotations...
  • Adams did not just stick to comics: he is the author of over a dozen books (not counting the comic compendiums), which advise and analyze not only on surviving the office but also on daily life, future technology trends, romance, self-help strategy, and more. 
  • In his earlier books, you can feel the weight of the 1990s pressing down on his work, flattening and numbing its potency; this was the period that social scientist Francis Fukuyama dubbed “the end of history”, when the Cold War had ended, the West had won, 9/11 was just two numbers, and there were no grand missions left, no worlds left to conquer. While for millions of people, both in the United States and abroad, life was still chaotic and miserable, a lot of people found themselves living lives that were under no great immediate threat: without bombs or fascism or the threat of eviction to worry about, there was nothing left to do but to go to the office and enjoy fast-casual dining and Big Gulps, just as the Founding Fathers envisioned.
  • This dull but steady life produced a sense of slow-burn anxiety prominent in much of the pop culture of the time, as can be seen in movies such as Office Space, Fight Club and The Matrix, movies which cooed to their audience: there’s got to be more to life than this, right?
  • In Dilbert the Pointy-haired Boss uses this type of thinking to evil ends, in the tradition of Catch-22 and other satires of systemic brutality, but the relatable characters use it to their advantage too—by using intellectual sleight of hand with the boss to justify doing less work, or by finding clever ways to look busy when they’re not, or to avoid people who are unpleasant to be around.
  • for someone who satirizes business bullshit, Adams is a person who seems to have bought into much of it wholeheartedly; when he explains his approach to life he tends to speak in LinkedIn truisms, expounding on his “skill stacks” and “maximizing [his] personal energy”. (You can read more about this in his career advice book, How to Fail at Almost Everything and Still Win Big;
  • Following his non-Dilbert career more carefully, you can see that at every stage of his career, he’s actually quite heavily invested in the bullshit he makes fun of every day, or at least some aspects of it: he possesses an MBA from UC Berkeley, and has launched or otherwise been involved in a significant number of business ventures, most amusingly a health food wrap called the “Dilberito”.
  • In the past few years, Adams has gained some notoriety as a Trump supporter; having slowly moved from “vaguely all-over-the-place centrist who has some odd thoughts and thinks some aspects of Trump are impressive” to full-on MAGA guy, even writing a book called Win Bigly praising Trump’s abilities as a “master persuader”.
  • this is a guy who hates drab corporatespeak but loves the ideology behind it, a guy who describes the vast powerlessness of life but believes you can change it by writing some words on a napkin. That blend of rebellion against the symptoms of post-Cold War society and sworn allegiance to its machinations couldn’t lead anywhere else but to Trump, a man who rails against ‘elites’ while allowing them to run the country into the ground.
  • Beware: as I’m pretty sure Nietzsche said, when you gaze into Dilbert, eventually Dilbert gazes back into you.
  • I just think Adams is a guy who spent so long in the world of slick aphorisms and comic-strip logic that it eventually ate into his brain, became his entire manner of thinking
Javier E

Don't Be Surprised About Facebook and Teen Girls. That's What Facebook Is. | Talking Po... - 0 views

  • First, set aside all morality. Let’s say we have a 16 year old girl who’s been doing searches about average weights, whether boys care if a girl is overweight and maybe some diets. She’s also spent some time on a site called AmIFat.com. Now I set you this task. You’re on the other side of the Facebook screen and I want you to get her to click on as many things as possible and spend as much time clicking or reading as possible. Are you going to show her movie reviews? Funny cat videos? Homework tips? Of course, not.
  • If you’re really trying to grab her attention you’re going to show her content about really thin girls, how their thinness has gotten them the attention of boys who turn out to really love them, and more diets
  • We both know what you’d do if you were operating within the goals and structure of the experiment.
  • ...17 more annotations...
  • This is what artificial intelligence and machine learning are. Facebook is a series of algorithms and goals aimed at maximizing engagement with Facebook. That’s why it’s worth hundreds of billions of dollars. It has a vast army of computer scientists and programmers whose job it is to make that machine more efficient.
  • the Facebook engine is designed to scope you out, take a psychographic profile of who you are and then use its data compiled from literally billions of humans to serve you content designed to maximize your engagement with Facebook.
  • Put in those terms, you barely have a chance.
  • Of course, Facebook can come in and say, this is damaging so we’re going to add some code that says don’t show this dieting/fat-shaming content to girls 18 and under. But the algorithms will find other vulnerabilities
  • So what to do? The decision of all the companies, if not all individuals, was just to lie. What else are you going to do? Say we’re closing down our multi-billion dollar company because our product shouldn’t exist?
  • why exactly are you creating a separate group of subroutines that yanks Facebook back when it does what it’s supposed to do particularly well? This, indeed, was how the internal dialog at Facebook developed, as described in the article I read. Basically, other executives said: Our business is engagement, why are we suggesting people log off for a while when they get particularly engaged?
  • what it makes me think about more is the conversations at Tobacco companies 40 or 50 years ago. At a certain point you realize: our product is bad. If used as intended it causes lung cancer, heart disease and various other ailments in a high proportion of the people who use the product. And our business model is based on the fact that the product is chemically addictive. Our product is getting people addicted to tobacco so that they no longer really have a choice over whether to buy it. And then a high proportion of them will die because we’ve succeeded.
  • The algorithms can be taught to find and address an infinite number of behaviors. But really you’re asking the researchers and programmers to create an alternative set of instructions where Instagram (or Facebook, same difference) jumps in and does exactly the opposite of its core mission, which is to drive engagement
  • You can add filters and claim you’re not marketing to kids. But really you’re only ramping back the vast social harm marginally at best. That’s the product. It is what it is.
  • there is definitely an analogy inasmuch as what you’re talking about here aren’t some glitches in the Facebook system. These aren’t some weird unintended consequences that can be ironed out of the product. It’s also in most cases not bad actors within Facebook. It’s what the product is. The product is getting attention and engagement against which advertising is sold
  • How good is the machine learning? Well, trial and error with between 3 and 4 billion humans makes you pretty damn good. That’s the product. It is inherently destructive, though of course the bad outcomes aren’t distributed evenly throughout the human population.
  • The business model is to refine this engagement engine, getting more attention and engagement and selling ads against the engagement. Facebook gets that revenue and the digital roadkill created by the product gets absorbed by the society at large
  • Facebook is like a spectacularly profitable nuclear energy company which is so profitable because it doesn’t build any of the big safety domes and dumps all the radioactive waste into the local river.
  • in the various articles describing internal conversations at Facebook, the shrewder executives and researchers seem to get this. For the company if not every individual they seem to be following the tobacco companies’ lead.
  • Ed. Note: TPM Reader AS wrote in to say I was conflating Facebook and Instagram and sometimes referring to one or the other in a confusing way. This is a fair
  • I spoke of them as the same intentionally. In part I’m talking about Facebook’s corporate ownership. Both sites are owned and run by the same parent corporation and as we saw during yesterday’s outage they are deeply hardwired into each other.
  • the main reason I spoke of them in one breath is that they are fundamentally the same. AS points out that the issues with Instagram are distinct because Facebook has a much older demographic and Instagram is a predominantly visual medium. (Indeed, that’s why Facebook corporate is under such pressure to use Instagram to drive teen and young adult engagement.) But they are fundamentally the same: AI and machine learning to drive engagement. Same same. Just different permutations of the same dynamic.
Javier E

Wailing And Gnashing Of Teeth: Trumpers React To Draft 'Audit' Report Showing Biden Win... - 0 views

  • the audit failed: Not only did it count Biden’s victory, but even its attempts to sow doubts about its own findings and the official results are fairly weak and rehearsed. 
  • But for Trump supporters desperate to keep the fiction going — particularly those who’ve staked their political campaigns on the Big Lie — the show needed to go on. 
  • Responding to the disappointing report, they ignored the bad news and acted as if it had affirmed their prior assumptions. And, therefore: Audits, forever and always. 
  • ...2 more annotations...
  • “Now that the audit of Maricopa is wrapping up, we need to Audit Pima County – the 2nd largest county in AZ,” Mark Finchem, a member of the state legislature and the Trump-endorsed candidate for Arizona secretary of state, tweeted. He urged readers to sign his “petition” for a Pima County audit — one that would give his campaign their personal information.
  • A state representative from Florida used the report to call for audits in every state in the country. 
Javier E

On the Shortness of Life 2.0 - by Peter Juul - The Liberal Patriot - 0 views

  • It’s a deft and eclectic synthesis of ancient and modern thinking about how humanity can come to terms with our limited time on Earth – the title derives from the length of the average human lifespan – ranging intellectually from ancient Greek and Roman philosophers like Seneca to modern-day Buddhist and existentialist thinkers. Stuffed with valuable and practical insights on life and how we use – or misuse – it, Four Thousand Weeks is an impressive and compact volume well worth the time and attention of even the most casual readers.
  • As Burkeman notes, our preoccupation with productivity allows us to evade “the anxiety that might arise if we were to ask ourselves whether we’re on the right path.” The end result is a lot of dedicated and talented people in politics and policy burning themselves out for no discernable or meaningful purpose.
  • Then there’s social media, defined by Burkeman as “a machine for misusing your life.” Social media platforms like Twitter and Facebook don’t just distract us from more important matters, he argues, “they change how we’re defining ‘important matters’ in the first place.”
  • ...15 more annotations...
  • Social media also amounts to “a machine for getting you to care about too many things, even if they’re each indisputably worthwhile.” Hence the urge to depict every policy problem as an urgent if not existential crisis
  • social media has turned all of us into “angrier, less empathetic, more anxious or more numbed out” versions of ourselves.
  • Finally, our political and policy debates tend towards what Burkeman calls “paralyzing grandiosity” – the false notion that in the face of problems like climate change, economic inequality, and ongoing threats to democracy “only the most revolutionary, world-transforming causes are worth fighting for.” It’s a sentiment that derives from and reinforces catastrophism and absolutism as ways of thinking about politics and policy
  • That sentiment also often results in impotent impatience, which in turn leads to frustration, anger, and cynicism when things don’t turn out exactly as we’ve hoped. But it also allows us to avoid hard choices required in order to pull together the political coalitions necessary to effect actual change.
  • Four Thousand Weeks is filled to the brim with practical advice
  • Embrace “radical incrementalism.”
  • Burkeman suggests we find some hobby we enjoy for its own sake, not because there’s some benefit we think we can derive from it
  • Take a break
  • rest for rest’s sake, “to spend some of our time, that is, on activities in which the only thing we’re trying to get from them is the doing itself.”
  • we should cultivate the patience to see our goals through step-by-step over the long term. We’ve got to resist the need for speed and desire for rapid resolution of problems, letting them instead take the time they take.
  • “To make a difference,” Burkeman argues, “you must focus your finite capacity for care.”
  • “Consolidate your caring” and think small.
  • it’s perfectly fine to dedicate your time to a limited subset of issues that you care deeply about. We’re only mortal, and as Burkeman points out it’s important to “consciously pick your battles in charity, activism, and politics.”
  • our lives are just as meaningful and worthwhile if we spend our time “on, say caring for an elderly relative with dementia or volunteering at the local community garden” as they are if we’re up to our eyeballs in the minutiae of politics and policy. What matters is that we make things slightly better with our contributions and actions
  • once we give up on the illusion of perfection, Burkeman observes, we “get to roll up [our] sleeves and start work on what’s gloriously possible instead.”
Javier E

Pandemic-Era Politics Are Ruining Public Education - The Atlantic - 0 views

  • You’re also the nonvoting, perhaps unwitting, subject of adults’ latest pedagogical experiments: either relentless test prep or test abolition; quasi-religious instruction in identity-based virtue and sin; a flood of state laws to keep various books out of your hands and ideas out of your head.
  • Your parents, looking over your shoulder at your education and not liking what they see, have started showing up at school-board meetings in a mortifying state of rage. If you live in Virginia, your governor has set up a hotline where they can rat out your teachers to the government. If you live in Florida, your governor wants your parents to sue your school if it ever makes you feel “discomfort” about who you are
  • Adults keep telling you the pandemic will never end, your education is being destroyed by ideologues, digital technology is poisoning your soul, democracy is collapsing, and the planet is dying—but they’re counting on you to fix everything when you grow up.
  • ...37 more annotations...
  • It isn’t clear how the American public-school system will survive the COVID years. Teachers, whose relative pay and status have been in decline for decades, are fleeing the field. In 2021, buckling under the stresses of the pandemic, nearly 1 million people quit jobs in public education, a 40 percent increase over the previous year.
  • These kids, and the investments that come with them, may never return—the beginning of a cycle of attrition that could continue long after the pandemic ends and leave public schools even more underfunded and dilapidated than before. “It’s an open question whether the public-school system will recover,” Steiner said. “That is a real concern for democratic education.”
  • The high-profile failings of public schools during the pandemic have become a political problem for Democrats, because of their association with unions, prolonged closures, and the pedagogy of social justice, which can become a form of indoctrination.
  • The party that stands for strong government services in the name of egalitarian principles supported the closing of schools far longer than either the science or the welfare of children justified, and it has been woefully slow to acknowledge how much this damaged the life chances of some of America’s most disadvantaged students.
  • Public education is too important to be left to politicians and ideologues. Public schools still serve about 90 percent of children across red and blue America.
  • Since the common-school movement in the early 19th century, the public school has had an exalted purpose in this country. It’s our core civic institution—not just because, ideally, it brings children of all backgrounds together in a classroom, but because it prepares them for the demands and privileges of democratic citizenship. Or at least, it needs to.
  • What is school for? This is the kind of foundational question that arises when a crisis shakes the public’s faith in an essential institution. “The original thinkers about public education were concerned almost to a point of paranoia about creating self-governing citizens,”
  • “Horace Mann went to his grave having never once uttered the phrase college- and career-ready. We’ve become more accustomed to thinking about the private ends of education. We’ve completely lost the habit of thinking about education as citizen-making.”
  • School can’t just be an economic sorting system. One reason we have a stake in the education of other people’s children is that they will grow up to be citizens.
  • Public education is meant not to mirror the unexamined values of a particular family or community, but to expose children to ways that other people, some of them long dead, think.
  • If the answer were simply to push more and more kids into college, the United States would be entering its democratic prime
  • So the question isn’t just how much education, but what kind. Is it quaint, or utopian, to talk about teaching our children to be capable of governing themselves?
  • The COVID era, with Donald Trump out of office but still in power and with battles over mask mandates and critical race theory convulsing Twitter and school-board meetings, shows how badly Americans are able to think about our collective problems—let alone read, listen, empathize, debate, reconsider, and persuade in the search for solutions.
  • democratic citizenship can, at least in part, be learned.
  • The history warriors build their metaphysics of national good or evil on a foundation of ignorance. In a 2019 survey, only 40 percent of Americans were able to pass the test that all applicants for U.S. citizenship must take, which asks questions like “Who did the United States fight in World War II?” and “We elect a President for how many years?” The only state in which a majority passed was Vermont.
  • The orthodoxies currently fighting for our children’s souls turn the teaching of U.S. history into a static and morally simple quest for some American essence. They proceed from celebration or indictment toward a final judgment—innocent or guilty—and bury either oppression or progress in a subordinate clause. The most depressing thing about this gloomy pedagogy of ideologies in service to fragile psyches is how much knowledge it takes away from students who already have so little
  • A central goal for history, social-studies, and civics instruction should be to give students something more solid than spoon-fed maxims—to help them engage with the past on its own terms, not use it as a weapon in the latest front of the culture wars.
  • Releasing them to do “research” in the vast ocean of the internet without maps and compasses, as often happens, guarantees that they will drown before they arrive anywhere.
  • The truth requires a grounding in historical facts, but facts are quickly forgotten without meaning and context
  • The goal isn’t just to teach students the origins of the Civil War, but to give them the ability to read closely, think critically, evaluate sources, corroborate accounts, and back up their claims with evidence from original documents.
  • This kind of instruction, which requires teachers to distinguish between exposure and indoctrination, isn’t easy; it asks them to be more sophisticated professionals than their shabby conditions and pay (median salary: $62,000, less than accountants and transit police) suggest we are willing to support.
  • To do that, we’ll need to help kids restore at least part of their crushed attention spans.
  • staring at a screen for hours is a heavy depressant, especially for teenagers.
  • we’ll look back on the amount of time we let our children spend online with the same horror that we now feel about earlier generations of adults who hooked their kids on smoking.
  • “It’s not a choice between tech or no tech,” Bill Tally, a researcher with the Education Development Center, told me. “The question is what tech infrastructure best enables the things we care about,” such as deep engagement with instructional materials, teachers, and other students.
  • The pandemic should have forced us to reassess what really matters in public school; instead, it’s a crisis that we’ve just about wasted.
  • Like learning to read as historians, learning to sift through the tidal flood of memes for useful, reliable information can emancipate children who have been heedlessly hooked on screens by the adults in their lives
  • Finally, let’s give children a chance to read books—good books. It’s a strange feature of all the recent pedagogical innovations that they’ve resulted in the gradual disappearance of literature from many classrooms.
  • The best way to interest young people in literature is to have them read good literature, and not just books that focus with grim piety on the contemporary social and psychological problems of teenagers.
  • We sell them insultingly short in thinking that they won’t read unless the subject is themselves. Mirrors are ultimately isolating; young readers also need windows, even if the view is unfamiliar, even if it’s disturbing
  • connection through language to universal human experience and thought is the reward of great literature, a source of empathy and wisdom.
  • The culture wars, with their atmosphere of resentment, fear, and petty faultfinding, are hostile to the writing and reading of literature.
  • W. E. B. Du Bois wrote: “Nations reel and stagger on their way; they make hideous mistakes; they commit frightful wrongs; they do great and beautiful things. And shall we not best guide humanity by telling the truth about all this, so far as the truth is ascertainable?”
  • The classroom has become a half-abandoned battlefield, where grown-ups who claim to be protecting students from the virus, from books, from ideologies and counter-ideologies end up using children to protect themselves and their own entrenched camps.
  • American democracy can’t afford another generation of adults who don’t know how to talk and listen and think. We owe our COVID-scarred children the means to free themselves from the failures of the past and the present.
  • Students are leaving as well. Since 2020, nearly 1.5 million children have been removed from public schools to attend private or charter schools or be homeschooled.
  • “COVID has encouraged poor parents to question the quality of public education. We are seeing diminished numbers of children in our public schools, particularly our urban public schools.” In New York, more than 80,000 children have disappeared from city schools; in Los Angeles, more than 26,000; in Chicago, more than 24,000.
criscimagnael

Writing History: An Introductory Guide to How History Is Produced | AHA - 0 views

  • In fact, history is NOT a "collection of facts about the past." History consists of making arguments about what happened in the past on the basis of what people recorded (in written documents, cultural artifacts, or oral traditions) at the time.
  • The problem is complicated for major events that produce "winners" and "losers," since we are more likely to have sources written by the "winners," designed to show why they were heroic in their victories.
  • There is no interpretation. There is no explanation of why the Mexicas lost.
  • ...7 more annotations...
  • Many individuals believe that history is about telling stories, but most historians also want answers to questions like why did the Mexicas lose?
  • Unfortunately, in the case of the conquest of Mexico, there is only one genuine primary source written from 1519-1521. This primary source consists of the letters Cortés wrote and sent to Spain.
  • Ideally, under each of these "thesis statements," that is, each of these arguments about why the Mexicas were defeated, the authors will give some examples of information that backs up their "thesis." To write effective history and history essays, in fact to write successfully in any area, you should begin your essay with the "thesis" or argument you want to prove with concrete examples that support your thesis.
  • To become a critical reader, to empower yourself to "own your own history," you should think carefully about whether the evidence the authors provide does in fact support their theses. 
  • Or, if your professor has said something in class that you are not sure about, find material to disprove it—the "trash the prof" approach (and, yes, it is really okay if you have the evidence).
  • If one analyzes omens in the conquest, will it change the theses or interpretations presented in the textbook?
  • One way to think about the primary sources is to ask the questions: (1) when was the source written, (2) who is the intended audience of the source, (3) what are the similarities between the accounts, (4) what are the differences between the accounts, (5) what pieces of information in the accounts will support your thesis, and (6) what information in the sources is totally irrelevant to the thesis or argument you want to make.
    • criscimagnael
       
      OPCVL!
Javier E

Opinion | What College Students Need Is a Taste of the Monk's Life - The New York Times - 0 views

  • When she registered last fall for the seminar known around campus as the monk class, she wasn’t sure what to expect.
  • “You give up technology, and you can’t talk for a month,” Ms. Rodriguez told me. “That’s all I’d heard. I didn’t know why.” What she found was a course that challenges students to rethink the purpose of education, especially at a time when machine learning is getting way more press than the human kind.
  • Each week, students would read about a different monastic tradition and adopt some of its practices. Later in the semester, they would observe a one-month vow of silence (except for discussions during Living Deliberately) and fast from technology, handing over their phones to him.
  • ...50 more annotations...
  • Yes, he knew they had other classes, jobs and extracurriculars; they could make arrangements to do that work silently and without a computer.
  • The class eased into the vow of silence, first restricting speech to 100 words a day. Other rules began on Day 1: no jewelry or makeup in class. Men and women sat separately and wore different “habits”: white shirts for the men, women in black. (Nonbinary and transgender students sat with the gender of their choice.)
  • Dr. McDaniel discouraged them from sharing personal information; they should get to know one another only through ideas. “He gave us new names, based on our birth time and day, using a Thai birth chart,”
  • “We were practicing living a monastic life. We had to wake up at 5 a.m. and journal every 30 minutes.”
  • If you tried to cruise to a C, you missed the point: “I realized the only way for me to get the most out of this class was to experience it all,” she said. (She got Dr. McDaniel’s permission to break her vow of silence in order to talk to patients during her clinical rotation.)
  • Dr. McDaniel also teaches a course called Existential Despair. Students meet once a week from 5 p.m. to midnight in a building with comfy couches, turn over their phones and curl up to read an assigned novel (cover to cover) in one sitting — books like James Baldwin’s “Giovanni’s Room” and José Saramago’s “Blindness.” Then they stay up late discussing it.
  • “The course is not about hope, overcoming things, heroic stories,” Dr. McDaniel said. Many of the books “start sad. In the middle they’re sad. They stay sad. I’m not concerned with their 20-year-old self. I’m worried about them at my age, dealing with breast cancer, their dad dying, their child being an addict, a career that never worked out — so when they’re dealing with the bigger things in life, they know they’re not alone.”
  • Both courses have long wait lists. Students are hungry for a low-tech, introspective experience —
  • Research suggests that underprivileged young people have far fewer opportunities to think for unbroken stretches of time, so they may need even more space in college to develop what social scientists call cognitive endurance.
  • Yet the most visible higher ed trends are moving in the other direction
  • Rather than ban phones and laptops from class, some professors are brainstorming ways to embrace students’ tech addictions with class Facebook and Instagram accounts, audience response apps — and perhaps even including the friends and relatives whom students text during class as virtual participants in class discussion.
  • Then there’s that other unwelcome classroom visitor: artificial intelligence.
  • stop worrying and love the bot by designing assignments that “help students develop their prompting skills” or “use ChatGPT to generate a first draft,” according to a tip sheet produced by the Center for Teaching and Learning at Washington University in St. Louis.
  • It’s not at all clear that we want a future dominated by A.I.’s amoral, Cheez Whiz version of human thought
  • It is abundantly clear that texting, tagging and chatbotting are making students miserable right now.
  • One recent national survey found that 60 percent of American college students reported the symptoms of at least one mental health problem and that 15 percent said they were considering suicide
  • A recent meta-analysis of 36 studies of college students’ mental health found a significant correlation between longer screen time and higher risk of anxiety and depression
  • And while social media can sometimes help suffering students connect with peers, research on teenagers and college students suggests that overall, the support of a virtual community cannot compensate for the vortex of gossip, bullying and Instagram posturing that is bound to rot any normal person’s self-esteem.
  • We need an intervention: maybe not a vow of silence but a bold move to put the screens, the pinging notifications and creepy humanoid A.I. chatbots in their proper place
  • it does mean selectively returning to the university’s roots in the monastic schools of medieval Europe and rekindling the old-fashioned quest for meaning.
  • Colleges should offer a radically low-tech first-year program for students who want to apply: a secular monastery within the modern university, with a curated set of courses that ban glowing rectangles of any kind from the classroom
  • Students could opt to live in dorms that restrict technology, too
  • I prophesy that universities that do this will be surprised by how much demand there is. I frequently talk to students who resent the distracting laptops all around them during class. They feel the tug of the “imaginary string attaching me to my phone, where I have to constantly check it,”
  • Many, if not most, students want the elusive experience of uninterrupted thought, the kind where a hash of half-baked notions slowly becomes an idea about the world.
  • Even if your goal is effective use of the latest chatbot, it behooves you to read books in hard copies and read enough of them to learn what an elegant paragraph sounds like. How else will students recognize when ChatGPT churns out decent prose instead of bureaucratic drivel?
  • Most important, students need head space to think about their ultimate values.
  • His course offers a chance to temporarily exchange those unconscious structures for a set of deliberate, countercultural ones.
  • here are the student learning outcomes universities should focus on: cognitive endurance and existential clarity.
  • Contemplation and marathon reading are not ends in themselves or mere vacations from real life but are among the best ways to figure out your own answer to the question of what a human being is for
  • When students finish, they can move right into their area of specialization and wire up their skulls with all the technology they want, armed with the habits and perspective to do so responsibly
  • it’s worth learning from the radicals. Dr. McDaniel, the religious studies professor at Penn, has a long history with different monastic traditions. He grew up in Philadelphia, educated by Hungarian Catholic monks. After college, he volunteered in Thailand and Laos and lived as a Buddhist monk.
  • He found that no amount of academic reading could help undergraduates truly understand why “people voluntarily take on celibacy, give up drinking and put themselves under authorities they don’t need to,” he told me. So for 20 years, he has helped students try it out — and question some of their assumptions about what it means to find themselves.
  • “On college campuses, these students think they’re all being individuals, going out and being wild,” he said. “But they’re in a playpen. I tell them, ‘You know you’ll be protected by campus police and lawyers. You have this entire apparatus set up for you. You think you’re being an individual, but look at your four friends: They all look exactly like you and sound like you. We exist in these very strict structures we like to pretend don’t exist.’”
  • Colleges could do all this in classes integrated with general education requirements: ideally, a sequence of great books seminars focused on classic texts from across different civilizations.
  • “For the last 1,500 years, Benedictines have had to deal with technology,” Placid Solari, the abbot there, told me. “For us, the question is: How do you use the tool so it supports and enhances your purpose or mission and you don’t get owned by it?”
  • for novices at his monastery, “part of the formation is discipline to learn how to control technology use.” After this initial time of limited phone and TV “to wean them away from overdependence on technology and its stimulation,” they get more access and mostly make their own choices.
  • Evan Lutz graduated this May from Belmont Abbey with a major in theology. He stressed the special Catholic context of Belmont’s resident monks; if you experiment with monastic practices without investigating the whole worldview, it can become a shallow kind of mindfulness tourism.
  • The monks at Belmont Abbey do more than model contemplation and focus. Their presence compels even non-Christians on campus to think seriously about vocation and the meaning of life. “Either what the monks are doing is valuable and based on something true, or it’s completely ridiculous,” Mr. Lutz said. “In both cases, there’s something striking there, and it asks people a question.”
  • Pondering ultimate questions and cultivating cognitive endurance should not be luxury goods.
  • David Peña-Guzmán, who teaches philosophy at San Francisco State University, read about Dr. McDaniel’s Existential Despair course and decided he wanted to create a similar one. He called it the Reading Experiment. A small group of humanities majors gathered once every two weeks for five and a half hours in a seminar room equipped with couches and a big round table. They read authors ranging from Jean-Paul Sartre to Frantz Fanon
  • “At the beginning of every class I’d ask students to turn off their phones and put them in ‘the Basket of Despair,’ which was a plastic bag,” he told me. “I had an extended chat with them about accessibility. The point is not to take away the phone for its own sake but to take away our primary sources of distraction. Students could keep the phone if they needed it. But all of them chose to part with their phones.”
  • Dr. Peña-Guzmán’s students are mostly working-class, first-generation college students. He encouraged them to be honest about their anxieties by sharing his own: “I said, ‘I’m a very slow reader, and it’s likely some or most of you will get further in the text than me because I’m E.S.L. and read quite slowly in English.’
  • For his students, the struggle to read long texts is “tied up with the assumption that reading can happen while multitasking and constantly interacting with technologies that are making demands on their attention, even at the level of a second,”
  • “These draw you out of the flow of reading. You get back to the reading, but you have to restart the sentence or even the paragraph. Often, because of these technological interventions into the reading experience, students almost experience reading backward — as constant regress, without any sense of progress. The more time they spend, the less progress they make.”
  • Dr. Peña-Guzmán dismissed the idea that a course like his is suitable only for students who don’t have to worry about holding down jobs or paying off student debt. “I’m worried by this assumption that certain experiences that are important for the development of personality, for a certain kind of humanistic and spiritual growth, should be reserved for the elite, especially when we know those experiences are also sources of cultural capital,
  • Courses like the Reading Experiment are practical, too, he added. “I can’t imagine a field that wouldn’t require some version of the skill of focused attention.”
  • The point is not to reject new technology but to help students retain the upper hand in their relationship with it.
  • Ms. Rodriguez said that before she took Living Deliberately and Existential Despair, she didn’t distinguish technology from education. “I didn’t think education ever went without technology. I think that’s really weird now. You don’t need to adapt every piece of technology to be able to learn better or more,” she said. “It can form this dependency.”
  • The point of college is to help students become independent humans who can choose the gods they serve and the rules they follow rather than allow someone else to choose for them
  • The first step is dethroning the small silicon idol in their pocket — and making space for the uncomfortable silence and questions that follow
Javier E

Book Review: 'The Maniac,' by Benjamín Labatut - The New York Times - 0 views

  • it quickly becomes clear that what “The Maniac” is really trying to get a lock on is our current age of digital-informational mastery and subjection
  • When von Neumann proclaims that, thanks to his computational advances, “all processes that are stable we shall predict” and “all processes that are unstable we shall control,” we’re being prompted to reflect on today’s ubiquitous predictive-slash-determinative algorithms.
  • When he publishes a paper about the feasibility of a self-reproducing machine — “you need to have a mechanism, not only of copying a being, but of copying the instructions that specify that being” — few contemporary readers will fail to home straight in on the fraught subject of A.I.
  • ...9 more annotations...
  • Haunting von Neumann’s thought experiment is the specter of a construct that, in its very internal perfection, lacks the element that would account for itself as a construct. “If someone succeeded in creating a formal system of axioms that was free of all internal paradoxes and contradictions,” another of von Neumann’s interlocutors, the logician Kurt Gödel, explains, “it would always be incomplete, because it would contain truths and statements that — while being undeniably true — could never be proven within the laws of that system.”
  • its deeper (and, for me, more compelling) theme: the relation between reason and madness.
  • Almost all the scientists populating the book are mad, their desire “to understand, to grasp the core of things” invariably wedded to “an uncontrollable mania”; even their scrupulously observed reason, their mode of logic elevated to religion, is framed as a form of madness. Von Neumann’s response to the detonation of the Trinity bomb, the world’s first nuclear explosion, is “so utterly rational that it bordered on the psychopathic,” his second wife, Klara Dan, muses
  • fanaticism, in the 1930s, “was the norm … even among us mathematicians.”
  • Pondering Gödel’s own descent into mania, the physicist Eugene Wigner claims that “paranoia is logic run amok.” If you’ve convinced yourself that there’s a reason for everything, “it’s a small step to begin to see hidden machinations and agents operating to manipulate the most common, everyday occurrences.”
  • the game theory-derived system of mutually assured destruction he devises in its wake is “perfectly rational insanity,” according to its co-founder Oskar Morgenstern.
  • Labatut has Morgenstern end his MAD deliberations by pointing out that humans are not perfect poker players. They are irrational, a fact that, while instigating “the ungovernable chaos that we see all around us,” is also the “mercy” that saves us, “a strange angel that protects us from the mad dreams of reason.”
  • But does von Neumann really deserve the title “Father of Computers,” granted him here by his first wife, Mariette Kovesi? Doesn’t Ada Lovelace have a prior claim as their mother? Feynman’s description of the Trinity bomb as “a little Frankenstein monster” should remind us that it was Mary Shelley, not von Neumann and his coterie, who first grasped the monumental stakes of modeling the total code of life, its own instructions for self-replication, and that it was Rosalind Franklin — working alongside, not under, Maurice Wilkins — who first carried out this modeling.
  • he at least grants his women broader, more incisive wisdom. Ehrenfest’s lover Nelly Posthumus Meyjes delivers a persuasive lecture on the Pythagorean myth of the irrational, suggesting that while scientists would never accept the fact that “nature cannot be cognized as a whole,” artists, by contrast, “had already fully embraced it.”
Javier E

The Age of Social Media Is Ending - The Atlantic - 0 views

  • Slowly and without fanfare, around the end of the aughts, social media took its place. The change was almost invisible, but it had enormous consequences. Instead of facilitating the modest use of existing connections—largely for offline life (to organize a birthday party, say)—social software turned those connections into a latent broadcast channel. All at once, billions of people saw themselves as celebrities, pundits, and tastemakers.
  • A global broadcast network where anyone can say anything to anyone else as often as possible, and where such people have come to think they deserve such a capacity, or even that withholding it amounts to censorship or suppression—that’s just a terrible idea from the outset. And it’s a terrible idea that is entirely and completely bound up with the concept of social media itself: systems erected and used exclusively to deliver an endless stream of content.
  • “social media,” a name so familiar that it has ceased to bear meaning. But two decades ago, that term didn’t exist
  • ...35 more annotations...
  • a “web 2.0” revolution in “user-generated content,” offering easy-to-use, easily adopted tools on websites and then mobile apps. They were built for creating and sharing “content,”
  • As the original name suggested, social networking involved connecting, not publishing. By connecting your personal network of trusted contacts (or “strong ties,” as sociologists call them) to others’ such networks (via “weak ties”), you could surface a larger network of trusted contacts
  • The whole idea of social networks was networking: building or deepening relationships, mostly with people you knew. How and why that deepening happened was largely left to the users to decide.
  • That changed when social networking became social media around 2009, between the introduction of the smartphone and the launch of Instagram. Instead of connection—forging latent ties to people and organizations we would mostly ignore—social media offered platforms through which people could publish content as widely as possible, well beyond their networks of immediate contacts.
  • Social media turned you, me, and everyone into broadcasters (if aspirational ones). The results have been disastrous but also highly pleasurable, not to mention massively profitable—a catastrophic combination.
  • A social network is an idle, inactive system—a Rolodex of contacts, a notebook of sales targets, a yearbook of possible soul mates. But social media is active—hyperactive, really—spewing material across those networks instead of leaving them alone until needed.
  • The authors propose social media as a system in which users participate in “information exchange.” The network, which had previously been used to establish and maintain relationships, becomes reinterpreted as a channel through which to broadcast.
  • The toxicity of social media makes it easy to forget how truly magical this innovation felt when it was new. From 2004 to 2009, you could join Facebook and everyone you’d ever known—including people you’d definitely lost track of—was right there, ready to connect or reconnect. The posts and photos I saw characterized my friends’ changing lives, not the conspiracy theories that their unhinged friends had shared with them
  • Twitter, which launched in 2006, was probably the first true social-media site, even if nobody called it that at the time. Instead of focusing on connecting people, the site amounted to a giant, asynchronous chat room for the world. Twitter was for talking to everyone—which is perhaps one of the reasons journalists have flocked to it
  • on Twitter, anything anybody posted could be seen instantly by anyone else. And furthermore, unlike posts on blogs or images on Flickr or videos on YouTube, tweets were short and low-effort, making it easy to post many of them a week or even a day.
  • soon enough, all social networks became social media first and foremost. When groups, pages, and the News Feed launched, Facebook began encouraging users to share content published by others in order to increase engagement on the service, rather than to provide updates to friends. LinkedIn launched a program to publish content across the platform, too. Twitter, already principally a publishing platform, added a dedicated “retweet” feature, making it far easier to spread content virally across user networks.
  • When we look back at this moment, social media had already arrived in spirit if not by name. RSS readers offered a feed of blog posts to catch up on, complete with unread counts. MySpace fused music and chatter; YouTube did it with video (“Broadcast Yourself”)
  • From being asked to review every product you buy to believing that every tweet or Instagram image warrants likes or comments or follows, social media produced a positively unhinged, sociopathic rendition of human sociality.
  • Other services arrived or evolved in this vein, among them Reddit, Snapchat, and WhatsApp, all far more popular than Twitter. Social networks, once latent routes for possible contact, became superhighways of constant content
  • Although you can connect the app to your contacts and follow specific users, on TikTok, you are more likely to simply plug into a continuous flow of video content that has oozed to the surface via algorithm.
  • In the social-networking era, the connections were essential, driving both content creation and consumption. But the social-media era seeks the thinnest, most soluble connections possible, just enough to allow the content to flow.
  • This is also why journalists became so dependent on Twitter: It’s a constant stream of sources, events, and reactions—a reporting automat, not to mention an outbound vector for media tastemakers to make tastes.
  • “influencer” became an aspirational role, especially for young people for whom Instagram fame seemed more achievable than traditional celebrity—or perhaps employment of any kind.
  • social-media operators discovered that the more emotionally charged the content, the better it spread across its users’ networks. Polarizing, offensive, or just plain fraudulent information was optimized for distribution. By the time the platforms realized and the public revolted, it was too late to turn off these feedback loops.
  • The ensuing disaster was multipart.
  • Rounding up friends or business contacts into a pen in your online profile for possible future use was never a healthy way to understand social relationships.
  • when social networking evolved into social media, user expectations escalated. Driven by venture capitalists’ expectations and then Wall Street’s demands, the tech companies—Google and Facebook and all the rest—became addicted to massive scale
  • Social media showed that everyone has the potential to reach a massive audience at low cost and high gain—and that potential gave many people the impression that they deserve such an audience.
  • On social media, everyone believes that anyone to whom they have access owes them an audience: a writer who posted a take, a celebrity who announced a project, a pretty girl just trying to live her life, that anon who said something afflictive
  • When network connections become activated for any reason or no reason, then every connection seems worthy of traversing.
  • people just aren’t meant to talk to one another this much. They shouldn’t have that much to say, they shouldn’t expect to receive such a large audience for that expression, and they shouldn’t suppose a right to comment or rejoinder for every thought or notion either.
  • Facebook and all the rest enjoyed a massive rise in engagement and the associated data-driven advertising profits that the attention-driven content economy created. The same phenomenon also created the influencer economy, in which individual social-media users became valuable as channels for distributing marketing messages or product sponsorships by means of their posts’ real or imagined reach
  • That’s no surprise, I guess, given that the model was forged in the fires of Big Tech companies such as Facebook, where sociopathy is a design philosophy.
  • If change is possible, carrying it out will be difficult, because we have adapted our lives to conform to social media’s pleasures and torments. It’s seemingly as hard to give up on social media as it was to give up smoking en masse
  • Quitting that habit took decades of regulatory intervention, public-relations campaigning, social shaming, and aesthetic shifts. At a cultural level, we didn’t stop smoking just because the habit was unpleasant or uncool or even because it might kill us. We did so slowly and over time, by forcing social life to suffocate the practice. That process must now begin in earnest for social media.
  • Something may yet survive the fire that would burn it down: social networks, the services’ overlooked, molten core. It was never a terrible idea, at least, to use computers to connect to one another on occasion, for justified reasons, and in moderation
  • The problem came from doing so all the time, as a lifestyle, an aspiration, an obsession. The offer was always too good to be true, but it’s taken us two decades to realize the Faustian nature of the bargain.
  • when I first wrote about downscale, the ambition seemed necessary but impossible. It still feels unlikely—but perhaps newly plausible.
  • To win the soul of social life, we must learn to muzzle it again, across the globe, among billions of people. To speak less, to fewer people and less often–and for them to do the same to you, and everyone else as well
  • We cannot make social media good, because it is fundamentally bad, deep in its very structure. All we can do is hope that it withers away, and play our small part in helping abandon it.
Javier E

Where We Went Wrong | Harvard Magazine - 0 views

  • John Kenneth Galbraith assessed the trajectory of America’s increasingly “affluent society.” His outlook was not a happy one. The nation’s increasingly evident material prosperity was not making its citizens any more satisfied. Nor, at least in its existing form, was it likely to do so
  • One reason, Galbraith argued, was the glaring imbalance between the opulence in consumption of private goods and the poverty, often squalor, of public services like schools and parks
  • Another was that even the bountifully supplied private goods often satisfied no genuine need, or even desire; a vast advertising apparatus generated artificial demand for them, and satisfying this demand failed to provide meaningful or lasting satisfaction.
  • ...28 more annotations...
  • economist J. Bradford DeLong ’82, Ph.D. ’87, looking back on the twentieth century two decades after its end, comes to a similar conclusion but on different grounds.
  • DeLong, professor of economics at Berkeley, looks to matters of “contingency” and “choice”: at key junctures the economy suffered “bad luck,” and the actions taken by the responsible policymakers were “incompetent.”
  • these were “the most consequential years of all humanity’s centuries.” The changes they saw, while in the first instance economic, also “shaped and transformed nearly everything sociological, political, and cultural.”
  • DeLong’s look back over the twentieth century energetically encompasses political and social trends as well; nor is his scope limited to the United States. The result is a work of strikingly expansive breadth and scope
  • labeling the book an economic history fails to convey its sweeping frame.
  • The century that is DeLong’s focus is what he calls the “long twentieth century,” running from just after the Civil War to the end of the 2000s when a series of events, including the biggest financial crisis since the 1930s followed by likewise the most severe business downturn, finally rendered the advanced Western economies “unable to resume economic growth at anything near the average pace that had been the rule since 1870.
  • And behind those missteps in policy stood not just failures of economic thinking but a voting public that reacted perversely, even if understandably, to the frustrations poor economic outcomes had brought them.
  • Within this 140-year span, DeLong identifies two eras of “El Dorado” economic growth, each facilitated by expanding globalization, and each driven by rapid advances in technology and changes in business organization for applying technology to economic ends
  • from 1870 to World War I, and again from World War II to 1973
  • fellow economist Robert J. Gordon ’62, who in his monumental treatise on The Rise and Fall of American Economic Growth (reviewed in “How America Grew,” May-June 2016, page 68) hailed 1870-1970 as a “special century” in this regard (interrupted midway by the disaster of the 1930s).
  • Gordon highlighted the role of a cluster of once-for-all-time technological advances—the steam engine, railroads, electrification, the internal combustion engine, radio and television, powered flight
  • Pessimistic that future technological advances (most obviously, the computer and electronics revolutions) will generate productivity gains to match those of the special century, Gordon therefore saw little prospect of a return to the rapid growth of those halcyon days.
  • DeLong instead points to a series of noneconomic (and non-technological) events that slowed growth, followed by a perverse turn in economic policy triggered in part by public frustration: In 1973 the OPEC cartel tripled the price of oil, and then quadrupled it yet again six years later.
  • For all too many Americans (and citizens of other countries too), the combination of high inflation and sluggish growth meant that “social democracy was no longer delivering the rapid progress toward utopia that it had delivered in the first post-World War II generation.”
  • Frustration over these and other ills in turn spawned what DeLong calls the “neoliberal turn” in public attitudes and economic policy. The new economic policies introduced under this rubric “did not end the slowdown in productivity growth but reinforced it.
  • the tax and regulatory changes enacted in this new climate channeled most of what economic gains there were to people already at the top of the income scale
  • Meanwhile, progressive “inclusion” of women and African Americans in the economy (and in American society more broadly) meant that middle- and lower-income white men saw even smaller gains—and, perversely, reacted by providing still greater support for policies like tax cuts for those with far higher incomes than their own.
  • Daniel Bell’s argument in his 1976 classic The Cultural Contradictions of Capitalism. Bell famously suggested that the very success of a capitalist economy would eventually undermine a society’s commitment to the values and institutions that made capitalism possible in the first place.
  • In DeLong’s view, the “greatest cause” of the neoliberal turn was “the extraordinary pace of rising prosperity during the Thirty Glorious Years, which raised the bar that a political-economic order had to surpass in order to generate broad acceptance.” At the same time, “the fading memory of the Great Depression led to the fading of the belief, or rather recognition, by the middle class that they, as well as the working class, needed social insurance.”
  • what the economy delivered to “hard-working white men” no longer matched what they saw as their just deserts: in their eyes, “the rich got richer, the unworthy and minority poor got handouts.”
  • As Bell would have put it, the politics of entitlement, bred by years of economic success that so many people had come to take for granted, squeezed out the politics of opportunity and ambition, giving rise to the politics of resentment.
  • The new era therefore became “a time to question the bourgeois virtues of hard, regular work and thrift in pursuit of material abundance.”
  • DeLong’s unspoken agenda would surely include rolling back many of the changes made in the U.S. tax code over the past half-century, as well as reinvigorating antitrust policy to blunt the dominance, and therefore outsize profits, of the mega-firms that now tower over key sectors of the economy
  • He would also surely reverse the recent trend moving away from free trade. Central bankers should certainly behave like Paul Volcker (appointed by President Carter), whose decisive action finally broke the 1970s inflation even at considerable economic cost
  • Not only Galbraith’s main themes but many of his more specific observations as well seem as pertinent, and important, today as they did then.
  • What will future readers of Slouching Towards Utopia conclude?
  • If anything, DeLong’s narratives will become more valuable as those events fade into the past. Alas, his description of fascism as having at its center “a contempt for limits, especially those implied by reason-based arguments; a belief that reality could be altered by the will; and an exaltation of the violent assertion of that will as the ultimate argument” will likely strike a nerve with many Americans not just today but in years to come.
  • what about DeLong’s core explanation of what went wrong in the latter third of his, and our, “long century”? I predict that it too will still look right, and important.
Javier E

Dispute Within Art Critics Group Over Diversity Reveals a Widening Rift - The New York ... - 0 views

  • Amussen, 33, is the editor of Burnaway, which focuses on criticism in the American South and often features young Black artists. (The magazine started in 2008 in response to layoffs at the Atlanta Journal-Constitution’s culture section and now runs as a nonprofit with four full-time employees and a budget that mostly consists of grants.)
  • Efforts to revive AICA-USA are continuing. In January, Jasmine Amussen joined the organization’s board to help rethink the meaning of criticism for a younger generation.
  • The organization has yearly dues of $115 and provides free access to many museums. But some members complained that the fee was too expensive for young critics, yet not enough to support significant programming.
  • ...12 more annotations...
  • “It just came down to not having enough money,” said Terence Trouillot, a senior editor at Frieze, a contemporary art magazine. He spent nearly three years on the AICA-USA board, resigning in 2022. He said that initiatives to re-energize the group “were just moving too slowly.”
  • According to Lilly Wei, a longtime AICA-USA board member who recently resigned, the group explored different ways of protecting writers in the industry. There were unrealized plans of turning the organization into a union; others hoped to create a permanent emergency fund to keep financially struggling critics afloat. She said the organization has instead canceled initiatives, including an awards program for the best exhibitions across the country.
  • Large galleries — including Gagosian, Hauser & Wirth, and Pace Gallery — now produce their own publications with interviews and articles sometimes written by the same freelance critics who simultaneously moonlight as curators and marketers. Within its membership, AICA-USA has a number of writers who belong to all three categories.
  • “It’s crazy that the ideal job nowadays is producing catalog essays for galleries, which are basically just sales pitches,” Dillon said in a phone interview. “Critical thinking about art is not valued financially.”
  • Noah Dillon, who was on the AICA-USA board until he resigned last year, has been reluctant to recommend that anyone follow his path to become a critic. Not that they could. The graduate program in art writing that he attended at the School of Visual Arts in Manhattan also closed during the pandemic.
  • David Velasco, editor in chief of Artforum, said in an interview that he hoped the magazine’s acquisition would improve the publication’s financial picture. The magazine runs nearly 700 reviews a year, Velasco said; about half of those run online and pay $50 for roughly 250 words. “Nobody I know who knows about art does it for the money,” Velasco said, “but I would love to arrive at a point where people could.”
  • While most editors recognize the importance of criticism in helping readers decipher contemporary art, and the multibillion-dollar industry it has created, venues for such writing are shrinking. Over the years, newspapers including The Philadelphia Inquirer and The Miami Herald have trimmed critics’ jobs.
  • In December, the Penske Media Corporation announced that it had acquired Artforum, a contemporary art journal, and was bringing the title under the same ownership as its two competitors, ARTnews and Art in America. Its sister publication, Bookforum, was not acquired and ceased operations. Through the pandemic, other outlets have shuttered, including popular blogs run by SFMOMA and the Walker Art Center in Minneapolis as well as smaller magazines called Astra and Elephant.
  • The need for change in museums was pointed out in the 2022 Burns Halperin Report, published by Artnet News in December, that analyzed more than a decade of data from over 30 cultural institutions. It found that just 11 percent of acquisitions at U.S. museums were by female artists and only 2.2 percent were by Black American artists
  • (National newspapers with art critics on staff include The New York Times, The Los Angeles Times, The Boston Globe and The Washington Post.)
  • Julia Halperin, one of the study’s organizers, who recently left her position as Artnet’s executive editor, said that the industry has an asymmetric approach to diversity. “The pool of artists is diversifying somewhat, but the pool of staff critics has not,” she said.
  • the matter of diversity in criticism is compounded by the fact that opportunities for all critics have been diminished.
Javier E

Opinion | Where Have all the Adults in Children's Books Gone? - The New York Times - 0 views

  • Some might see the entrenchment of child-centeredness in children’s literature as reinforcing what some social critics consider a rising tide of narcissism in young people today. But to be fair: Such criticisms of youth transcend the ages.
  • What is certainly true now is the primacy of “mirrors and windows,” a philosophy that strives to show children characters who reflect how they look back to them, as well as those from different backgrounds, mostly with an eye to diversity.
  • This is a noble goal, but those mirrors and windows should apply to adults as well. Adults are, after all, central figures in children’s lives — their parents and caregivers, their teachers, their role models
  • ...7 more annotations...
  • The implicit lesson is that grown-ups aren’t infallible. It’s OK to laugh at them and it’s OK to feel compassion for them and it’s even OK to feel sorry for them on occasion.
  • The adult figures in children’s literature are also frequently outsiders or eccentrics in some way, and quite often subject to ridicule
  • yes, adults are often the Other — which makes them a mystery and a curiosity. Literature offers insight into these occasionally intimidating creatures.
  • In real life, children revere adults and they fear them. It only follows, then, that they appreciate when adult characters behave admirably but also delight in seeing the consequences — especially when rendered with humor — when they don’t.
  • Nursery rhymes, folk tales, myths and legends overwhelmingly cast adults as their central characters — and have endured for good reason
  • In somewhat later tales, children investigated crimes alongside Sherlock Holmes, adventured through Narnia, inhabited Oz and traversed Middle-earth. Grown-up heroes can be hobbits, or rabbits (“Watership Down”), badgers or moles (“The Wind in the Willows”). Children join them no matter what because they like to be in league with their protagonists and by extension, their authors.
  • In children’s books with adult heroes, children get to conspire alongside their elders. Defying the too-often adversarial relationship between adults and children in literature, such books enable children to see that adults are perfectly capable of occupying their shared world with less antagonism — as partners in life, in love and in adventure.
Javier E

A Brief History of Media and Audiences and Twitter and The Bulwark - 0 views

  • In the old days—and here I mean even as recently as 2000 or 2004—audiences were built around media institutions. The New York Times had an audience. The New Yorker had an audience. The Weekly Standard had an audience.
  • If you were a writer, you got access to these audiences by contributing to the institutions. No one cared if you, John Smith, wrote a piece about Al Gore. But if your piece about Al Gore appeared in Washington Monthly, then suddenly you had an audience.
  • There were a handful of star writers for whom this wasn’t true: Maureen Dowd, Tom Wolfe, Joan Didion. Readers would follow these stars wherever they appeared. But they were the exceptions to the rule. And the only way to ascend to such exalted status was by writing a lot of great pieces for established institutions and slowly assembling your audience from theirs.
  • ...16 more annotations...
  • The internet stripped institutions of their gatekeeping powers, thus making it possible for anyone to publish—and making it inevitable that many writers would create audiences independent of media institutions.
  • The internet destroyed the apprenticeship system that had dominated American journalism for generations. Under the old system, an aspiring writer took a low-level job at a media institution and worked her way up the ladder until she was trusted enough to write.
  • Under the new system, people started their careers writing outside of institutions—on personal blogs—and then were hired by institutions on the strength of their work.
  • In practice, these outsiders were primarily hired not on the merits of their work, but because of the size of their audience.
  • what it really did was transform the nature of audiences. Once the internet existed it became inevitable that institutions would see their power to hold audiences wane while individual writers would have their power to build personal audiences explode.
  • this meant that institutions would begin to hire based on the size of a writer’s audience. Which meant that writers’ overriding professional imperative was to build an audience, since that was the key to advancement.
  • Twitter killed the blog and lowered the barrier to entry for new writers from “Must have a laptop, the ability to navigate WordPress, and the capacity to write paragraphs” to “Do you have an iPhone and the ability to string 20 words together? With or without punctuation?”
  • If you were able to build a big enough audience on Twitter, then media institutions fell all over themselves trying to hire you—because they believed that you would then bring your audience to them.
  • If you were a writer for the Washington Post, or Wired, or the Saginaw Express, you had to build your own audience not to advance, but to avoid being replaced.
  • For journalists, audience wasn’t just status—it was professional capital. In fact, it was the most valuable professional capital.
  • Everything we just talked about was driven by the advertising model of media, which prized pageviews and unique users above all else. About a decade ago, that model started to fray around the edges, which caused a shift to the subscription model.
  • Today, if you’re a subscription publication, what Twitter gives you is growth opportunity. Twitter’s not the only channel for growth—there are lots of others, from TikTok to LinkedIn to YouTube to podcasts to search. But it’s an important one.
  • Twitter’s attack on Substack was an attack on the subscription model of journalism itself.
  • since media has already seen the ad-based model fall apart, it’s not clear what the alternative will be if the subscription model dies, too.
  • All of which is why having a major social media platform run by a capricious bad actor is suboptimal.
  • And why I think anyone else who’s concerned about the future of media ought to start hedging against Twitter. None of the direct hedges—Post, Mastodon, etc.—are viable yet. But tech history shows that these shifts can happen fairly quickly.
Javier E

A Leading Memory Researcher Explains How to Make Precious Moments Last - The New York T... - 0 views

  • Our memories form the bedrock of who we are. Those recollections, in turn, are built on one very simple assumption: This happened. But things are not quite so simple
  • “We update our memories through the act of remembering,” says Charan Ranganath, a professor of psychology and neuroscience at the University of California, Davis, and the author of the illuminating new book “Why We Remember.” “So it creates all these weird biases and infiltrates our decision making. It affects our sense of who we are.
  • Rather than being photo-accurate repositories of past experience, Ranganath argues, our memories function more like active interpreters, working to help us navigate the present and future. The implication is that who we are, and the memories we draw on to determine that, are far less fixed than you might think. “Our identities,” Ranganath says, “are built on shifting sand.”
  • ...24 more annotations...
  • People believe that memory should be effortless, but their expectations for how much they should remember are totally out of whack with how much they’re capable of remembering.
  • What is the most common misconception about memory?
  • Another misconception is that memory is supposed to be an archive of the past. We expect that we should be able to replay the past like a movie in our heads.
  • we don’t replay the past as it happened; we do it through a lens of interpretation and imagination.
  • How much are we capable of remembering, from both an episodic standpoint (episodic memory is the term for the memory of life experiences) and a semantic one (semantic memory is the term for the memory of facts and knowledge about the world)?
  • I would argue that we’re all everyday-memory experts, because we have this exceptional semantic memory, which is the scaffold for episodic memory.
  • If what we’re remembering, or the emotional tenor of what we’re remembering, is dictated by how we’re thinking in a present moment, what can we really say about the truth of a memory?
  • But if memories are malleable, what are the implications for how we understand our “true” selves?
  • your question gets to a major purpose of memory, which is to give us an illusion of stability in a world that is always changing. Because if we look for memories, we’ll reshape them into our beliefs of what’s happening right now. We’ll be biased in terms of how we sample the past. We have these illusions of stability, but we are always changing
  • And depending on what memories we draw upon, those life narratives can change.
  • I know it sounds squirmy to say, “Well, I can’t answer the question of how much we remember,” but I don’t want readers to walk away thinking memory is all made up.
  • One thing that makes the human brain so sophisticated is that we have a longer timeline in which we can integrate information than many other species. That gives us the ability to say: “Hey, I’m walking up and giving money to the cashier at the cafe. The barista is going to hand me a cup of coffee in about a minute or two.”
  • There is this illusion that we know exactly what’s going to happen, but the fact is we don’t. Memory can overdo it: Somebody lied to us once, so they are a liar; somebody shoplifted once, they are a thief.
  • If people have a vivid memory of something that sticks out, that will overshadow all their knowledge about the way things work. So there’s kind of an illusion.
  • we have this illusion that much of the world is cause and effect. But the reason, in my opinion, that we have that illusion is that our brain is constantly trying to find the patterns
  • I think of memory more like a painting than a photograph. There’s often photorealistic aspects of a painting, but there’s also interpretation. As a painter evolves, they could revisit the same subject over and over and paint differently based on who they are now. We’re capable of remembering things in extraordinary detail, but we infuse meaning into what we remember. We’re designed to extract meaning from the past, and that meaning should have truth in it. But it also has knowledge and imagination and, sometimes, wisdom.
  • memory, often, is educated guesses by the brain about what’s important. So what’s important? Things that are scary, things that get your desire going, things that are surprising. Maybe you were attracted to this person, and your eyes dilated, your pulse went up. Maybe you were working on something in this high state of excitement, and your dopamine was up.
  • It could be any of those things, but they’re all important in some way, because if you’re a brain, you want to take what’s surprising, you want to take what’s motivationally important for survival, what’s new.
  • On the more intentional side, are there things that we might be able to do in the moment to make events last in our memories? In some sense, it’s about being mindful. If we want to form a new memory, focus on aspects of the experience you want to take with you.
  • If you’re with your kid, you’re at a park, focus on the parts of it that are great, not the parts that are kind of annoying. Then you want to focus on the sights, the sounds, the smells, because those will give you rich detail later on
  • Another part of it, too, is that we kill ourselves by inducing distractions in our world. We have alerts on our phones. We check email habitually.
  • When we go on trips, I take candid shots. These are the things that bring you back to moments. If you capture the feelings and the sights and the sounds that bring you to the moment, as opposed to the facts of what happened, that is a huge part of getting the best of memory.
  • this goes back to the question of whether the factual truth of a memory matters to how we interpret it. I think it matters to have some truth, but then again, many of the truths we cling to depend on our own perspective.
  • There’s a great experiment on this. These researchers had people read this story about a house. (The study was “Recall of Previously Unrecallable Information Following a Shift in Perspective,” by Richard C. Anderson and James W. Pichert.) One group of subjects is told, I want you to read this story from the perspective of a prospective home buyer. When they remember it, they remember all the features of the house that are described in the thing. Another group is told, I want you to remember this from the perspective of a burglar. Those people tend to remember the valuables in the house and things that you would want to take. But what was interesting was then they switched the groups around. All of a sudden, people could pull up a number of details that they didn’t pull up before. It was always there, but they just didn’t approach it from that mind-set. So we do have a lot of information that we can get if we change our perspective, and this ability to change our perspective is exceptionally important for being accurate. It’s exceptionally important for being able to grow and modify our beliefs