TOK Friends: Group items matching "archives" in title, tags, annotations or url

Javier E

Opinion | What's Ripping American Families Apart? - The New York Times - 0 views

  • At least 27 percent of Americans are estranged from a member of their own family, and research suggests about 40 percent of Americans have experienced estrangement at some point.
  • The most common form of estrangement is between adult children and one or both parents — a cut usually initiated by the child. A study published in 2010 found that parents in the U.S. are about twice as likely to be in a contentious relationship with their adult children as parents in Israel, Germany, England and Spain.
  • the children in these cases often cite harsh parenting, parental favoritism, divorce and poor and increasingly hostile communication often culminating in a volcanic event
  • ...14 more annotations...
  • The parents in these cases are often completely bewildered by the accusations. They often remember a totally different childhood home and accuse their children of rewriting what happened.
  • a more individualistic culture has meant that the function of family has changed. Once it was seen as a bond of mutual duty and obligation, and now it is often seen as a launchpad for personal fulfillment. There’s more permission to cut off people who seem toxic in your life.
  • There’s a lot of real emotional abuse out there, but as Coleman put it in an essay in The Atlantic, “My recent research — and my clinical work over the past four decades — has shown me that you can be a conscientious parent and your kid may still want nothing to do with you when they’re older.”
  • Either way, there’s a lot of agony for all concerned. The children feel they have to live with the legacy of an abusive childhood. The parents feel rejected by the person they love most in the world, their own child, and they are powerless to do anything about it. There’s anger, grief and depression on all sides — painful holidays and birthdays — plus, the next generation often grows up without knowing their grandparents.
  • there seems to be a generational shift in what constitutes abuse. Practices that seemed like normal parenting to one generation are conceptualized as abusive, overbearing and traumatizing to another.
  • Becca Bland, founder of the British support and advocacy group Stand Alone, told the BBC: “Now I can put my needs first rather than trying to fix things beyond my control. But, yes, I’m angry I didn’t get the mother I wanted.”
  • A 2012 survey from the Institute for Advanced Studies in Culture found that almost three-quarters of parents of school-age kids said they eventually want to become their children’s best friend.
  • Some kids seem to think they need to cut off their parents just to have their own life. “My mom is really needy and I just don’t need that in my life,
  • In other cases, children may be blaming their parents for the fact that they are not succeeding as they had hoped — it’s Mom and Dad who screwed me up.
  • it feels like a piece of what seems to be the psychological unraveling of America
  • Terrible trends are everywhere. Major depression rates among youths aged 12 to 17 rose by almost 63 percent between 2013 and 2016. American suicide rates increased by 33 percent between 1999 and 2019. The percentage of Americans who say they have no close friends has quadrupled since 1990,
  • Fifty-four percent of Americans report sometimes or always feeling that no one knows them well, according to a 2018 Ipsos survey.
  • political tribalism becomes a mechanism with which people can shore themselves up, vanquish shame, fight for righteousness and find a sense of belonging.
  • if we do not transform our pain, we will most assuredly transmit it.
Javier E

Opinion | You Are the Object of Facebook's Secret Extraction Operation - The New York Times - 0 views

  • Facebook is not just any corporation. It reached trillion-dollar status in a single decade by applying the logic of what I call surveillance capitalism — an economic system built on the secret extraction and manipulation of human data
  • Facebook and other leading surveillance capitalist corporations now control information flows and communication infrastructures across the world.
  • These infrastructures are critical to the possibility of a democratic society, yet our democracies have allowed these companies to own, operate and mediate our information spaces unconstrained by public law.
  • ...56 more annotations...
  • The result has been a hidden revolution in how information is produced, circulated and acted upon
  • The world’s liberal democracies now confront a tragedy of the “un-commons.” Information spaces that people assume to be public are strictly ruled by private commercial interests for maximum profit.
  • The internet as a self-regulating market has been revealed as a failed experiment. Surveillance capitalism leaves a trail of social wreckage in its wake: the wholesale destruction of privacy, the intensification of social inequality, the poisoning of social discourse with defactualized information, the demolition of social norms and the weakening of democratic institutions.
  • These social harms are not random. They are tightly coupled effects of evolving economic operations. Each harm paves the way for the next and is dependent on what went before.
  • There is no way to escape the machine systems that surveil us
  • All roads to economic and social participation now lead through surveillance capitalism’s profit-maximizing institutional terrain, a condition that has intensified during nearly two years of global plague.
  • Will Facebook’s digital violence finally trigger our commitment to take back the “un-commons”?
  • Will we confront the fundamental but long ignored questions of an information civilization: How should we organize and govern the information and communication spaces of the digital century in ways that sustain and advance democratic values and principles?
  • Mark Zuckerberg’s start-up did not invent surveillance capitalism. Google did that. In 2000, when only 25 percent of the world’s information was stored digitally, Google was a tiny start-up with a great search product but little revenue.
  • By 2001, in the teeth of the dot-com bust, Google’s leaders found their breakthrough in a series of inventions that would transform advertising. Their team learned how to combine massive data flows of personal information with advanced computational analyses to predict where an ad should be placed for maximum “click through.”
  • Google’s scientists learned how to extract predictive metadata from this “data exhaust” and use it to analyze likely patterns of future behavior.
  • Prediction was the first imperative that determined the second imperative: extraction.
  • Lucrative predictions required flows of human data at unimaginable scale. Users did not suspect that their data was secretly hunted and captured from every corner of the internet and, later, from apps, smartphones, devices, cameras and sensors
  • User ignorance was understood as crucial to success. Each new product was a means to more “engagement,” a euphemism used to conceal illicit extraction operations.
  • When asked “What is Google?” the co-founder Larry Page laid it out in 2001,
  • “Storage is cheap. Cameras are cheap. People will generate enormous amounts of data,” Mr. Page said. “Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
  • Instead of selling search to users, Google survived by turning its search engine into a sophisticated surveillance medium for seizing human data
  • Company executives worked to keep these economic operations secret, hidden from users, lawmakers, and competitors. Mr. Page opposed anything that might “stir the privacy pot and endanger our ability to gather data,” Mr. Edwards wrote.
  • As recently as 2017, Eric Schmidt, the executive chairman of Google’s parent company, Alphabet, acknowledged the role of Google’s algorithmic ranking operations in spreading corrupt information. “There is a line that we can’t really get across,” he said. “It is very difficult for us to understand truth.” A company with a mission to organize and make accessible all the world’s information using the most sophisticated machine systems cannot discern corrupt information.
  • This is the economic context in which disinformation wins
  • In March 2008, Mr. Zuckerberg hired Google’s head of global online advertising, Sheryl Sandberg, as his second in command. Ms. Sandberg had joined Google in 2001 and was a key player in the surveillance capitalism revolution. She led the build-out of Google’s advertising engine, AdWords, and its AdSense program, which together accounted for most of the company’s $16.6 billion in revenue in 2007.
  • A Google multimillionaire by the time she met Mr. Zuckerberg, Ms. Sandberg had a canny appreciation of Facebook’s immense opportunities for extraction of rich predictive data. “We have better information than anyone else. We know gender, age, location, and it’s real data as opposed to the stuff other people infer,” Ms. Sandberg explained
  • The company had “better data” and “real data” because it had a front-row seat to what Mr. Page had called “your whole life.”
  • Facebook paved the way for surveillance economics with new privacy policies in late 2009. The Electronic Frontier Foundation warned that new “Everyone” settings eliminated options to restrict the visibility of personal data, instead treating it as publicly available information.
  • Mr. Zuckerberg “just went for it” because there were no laws to stop him from joining Google in the wholesale destruction of privacy. If lawmakers wanted to sanction him as a ruthless profit-maximizer willing to use his social network against society, then 2009 to 2010 would have been a good opportunity.
  • Facebook was the first follower, but not the last. Google, Facebook, Amazon, Microsoft and Apple are private surveillance empires, each with distinct business models.
  • In 2021 these five U.S. tech giants represent five of the six largest publicly traded companies by market capitalization in the world.
  • As we move into the third decade of the 21st century, surveillance capitalism is the dominant economic institution of our time. In the absence of countervailing law, this system successfully mediates nearly every aspect of human engagement with digital information
  • Today all apps and software, no matter how benign they appear, are designed to maximize data collection.
  • Historically, great concentrations of corporate power were associated with economic harms. But when human data are the raw material and predictions of human behavior are the product, then the harms are social rather than economic
  • The difficulty is that these novel harms are typically understood as separate, even unrelated, problems, which makes them impossible to solve. Instead, each new stage of harm creates the conditions for the next stage.
  • Fifty years ago the conservative economist Milton Friedman exhorted American executives, “There is one and only one social responsibility of business — to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” Even this radical doctrine did not reckon with the possibility of no rules.
  • With privacy out of the way, ill-gotten human data are concentrated within private corporations, where they are claimed as corporate assets to be deployed at will.
  • The sheer size of this knowledge gap is conveyed in a leaked 2018 Facebook document, which described its artificial intelligence hub, ingesting trillions of behavioral data points every day and producing six million behavioral predictions each second.
  • Next, these human data are weaponized as targeting algorithms, engineered to maximize extraction and aimed back at their unsuspecting human sources to increase engagement
  • Targeting mechanisms change real life, sometimes with grave consequences. For example, the Facebook Files depict Mr. Zuckerberg using his algorithms to reinforce or disrupt the behavior of billions of people. Anger is rewarded or ignored. News stories become more trustworthy or unhinged. Publishers prosper or wither. Political discourse turns uglier or more moderate. People live or die.
  • Occasionally the fog clears to reveal the ultimate harm: the growing power of tech giants willing to use their control over critical information infrastructure to compete with democratically elected lawmakers for societal dominance.
  • when it comes to the triumph of surveillance capitalism’s revolution, it is the lawmakers of every liberal democracy, especially in the United States, who bear the greatest burden of responsibility. They allowed private capital to rule our information spaces during two decades of spectacular growth, with no laws to stop it.
  • All of it begins with extraction. An economic order founded on the secret massive-scale extraction of human data assumes the destruction of privacy as a nonnegotiable condition of its business operations.
  • We can’t fix all our problems at once, but we won’t fix any of them, ever, unless we reclaim the sanctity of information integrity and trustworthy communications
  • The abdication of our information and communication spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs solutions to all other crises.
  • Neither Google, nor Facebook, nor any other corporate actor in this new economic order set out to destroy society, any more than the fossil fuel industry set out to destroy the earth.
  • like global warming, the tech giants and their fellow travelers have been willing to treat their destructive effects on people and society as collateral damage — the unfortunate but unavoidable byproduct of perfectly legal economic operations that have produced some of the wealthiest and most powerful corporations in the history of capitalism.
  • Where does that leave us?
  • Democracy is the only countervailing institutional order with the legitimate authority and power to change our course. If the ideal of human self-governance is to survive the digital century, then all solutions point to one solution: a democratic counterrevolution.
  • instead of the usual laundry lists of remedies, lawmakers need to proceed with a clear grasp of the adversary: a single hierarchy of economic causes and their social harms.
  • We can’t rid ourselves of later-stage social harms unless we outlaw their foundational economic causes
  • This means we move beyond the current focus on downstream issues such as content moderation and policing illegal content. Such “remedies” only treat the symptoms without challenging the illegitimacy of the human data extraction that funds private control over society’s information spaces
  • Similarly, structural solutions like “breaking up” the tech giants may be valuable in some cases, but they will not affect the underlying economic operations of surveillance capitalism.
  • Instead, discussions about regulating big tech should focus on the bedrock of surveillance economics: the secret extraction of human data from realms of life once called “private.”
  • No secret extraction means no illegitimate concentrations of knowledge about people. No concentrations of knowledge means no targeting algorithms. No targeting means that corporations can no longer control and curate information flows and social speech or shape human behavior to favor their interests
  • the sober truth is that we need lawmakers ready to engage in a once-a-century exploration of far more basic questions:
  • How should we structure and govern information, connection and communication in a democratic digital century?
  • What new charters of rights, legislative frameworks and institutions are required to ensure that data collection and use serve the genuine needs of individuals and society?
  • What measures will protect citizens from unaccountable power over information, whether it is wielded by private companies or governments?
  • The corporation that is Facebook may change its name or its leaders, but it will not voluntarily change its economics.
Javier E

The Great PowerPoint Panic of 2003 - The Atlantic - 0 views

  • if all of those bad presentations really led to broad societal ills, the proof is hard to find.
  • Some scientists have tried to take a formal measure of the alleged PowerPoint Effect, asking whether the software really influences our ability to process information. Sebastian Kernbach, a professor of creativity and design at the University of St. Gallen, in Switzerland, has co-authored multiple reviews synthesizing this literature. On the whole, he told me, the research suggests that Tufte was partly right, partly wrong. PowerPoint doesn’t seem to make us stupid—there is no evidence of lower information retention or generalized cognitive decline, for example, among those who use it—but it does impose a set of assumptions about how information ought to be conveyed: loosely, in bullet points, and delivered by presenters to an audience of passive listeners. These assumptions have even reshaped the physical environment for the slide-deck age, Kernbach said: Seminar tables, once configured in a circle, have been bent, post-PowerPoint, into a U-shape to accommodate presenters.
  • When I spoke with Kernbach, he was preparing for a talk on different methods of visual thinking to a group of employees at a large governmental organization. He said he planned to use a flip chart, draw on blank slides like a white board, and perhaps even have audience members do some drawing of their own. But he was also gearing up to use regular old PowerPoint slides. Doing so, he told me, would “signal preparation and professionalism” for his audience. The organization was NASA.
  • ...3 more annotations...
  • The fact that the American space agency still uses PowerPoint should not be surprising. Despite the backlash it inspired in the press, and the bile that it raised in billionaires, and the red alert it caused within the military, the corporate-presentation juggernaut rolls on. The program has more monthly users than ever before, according to Shawn Villaron, Microsoft’s vice president of product for PowerPoint—well into the hundreds of millions. If anything, its use cases have proliferated. During lockdown, people threw PowerPoint parties on Zoom. Kids now make PowerPoint presentations for their parents when they want to get a puppy or quit soccer or attend a Niall Horan meet and greet. If PowerPoint is evil, then evil rules the world.
  • it’s tempting to entertain counterfactuals and wonder how things might have played out if Tufte and the rest of us had worried about social media back in 2003 instead of presentation software. Perhaps a timely pamphlet on The Cognitive Style of Friendster or a Wired headline asserting that “LinkedIn Is Evil” would have changed the course of history. If the social-media backlash of the past few years had been present from the start, maybe Facebook would never have grown into the behemoth it is now, and the country would never have become so hopelessly divided.
  • it could be that nothing whatsoever would have changed. No matter what their timing, and regardless of their aptness, concerns about new media rarely seem to make a difference. Objections get steamrolled. The new technology takes over. And years later, when we look back and think, How strange that we were so perturbed, the effects of that technology may well be invisible.
Javier E

Opinion | Will Translation Apps Make Learning Foreign Languages Obsolete? - The New York Times - 0 views

  • In Europe, nine out of 10 students study a foreign language. In the United States, only one in five do. Between 1997 and 2008, the number of American middle schools offering foreign languages dropped from 75 percent to 58 percent. Between 2009 and 2013, one American college closed its foreign language program; between 2013 and 2017, 651 others did the same.
  • At first glance, these statistics look like a tragedy. But I am starting to harbor the odd opinion that maybe they are not. What is changing my mind is technology.
  • what about spoken language? I was in Belgium not long ago, and I watched various tourists from a variety of nations use instant speech translation apps to render their own languages into English and French. The newer ones can even reproduce the tone of the speaker’s voice; a leading model, iTranslate, publicizes that its Translator app has had 200 million downloads so far.
  • ...12 more annotations...
  • I don’t think these tools will ever render learning foreign languages completely obsolete. Real conversation in the flowing nuances of casual speech cannot be rendered by a program, at least not in a way that would convey full humanity.
  • even if it may fail at genuine, nuanced conversation — for now, at least — technology is eliminating most of the need to learn foreign languages for more utilitarian purposes.
  • The old-school language textbook scenarios, of people reserving hotel rooms or ordering meals in the language of the country they are visiting — “Greetings. Please bring me a glass of lemonade and a sandwich!” — will now be obsolete
  • to actively enjoy piecing together how other languages work is an individual quirk, not a human universal
  • Obsessive language learners have come to call themselves the polyglot community over the past couple of decades, and I am one of them, to an extent. As such, I know well how hard it can be to recognize that most human beings are numb to this peculiar desire.
  • Most human beings are interested much less in how they are saying things, and which language they are saying them in, than in what they are saying.
  • Learning to express this what — beyond the very basics — in another language is hard. It can be especially hard for us Anglophones, as speaking English works at least decently in so many places
  • To polyglots, foreign languages are Mount Everests daring us to climb them — a metaphor used by Hofstadter in his article. But to most people, they are just a barrier to get to the other side of.
  • After all, despite the sincere and admirable efforts of foreign language teachers nationwide, fewer than one in 100 American students become proficient in a language they learned in school.
  • Because I love trying to learn languages and am endlessly fascinated by their varieties and complexities, I am working hard to wrap my head around this new reality. With an iPhone handy and an appropriate app downloaded, foreign languages will no longer present most people with the barrier or challenge they once did
  • Learning to genuinely speak a new language will hardly be unknown. It will continue to beckon, for instance, for those actually relocating to a new country. And it will persist with people who want to engage with literature or media in the original language, as well as those of us who find pleasure in mastering these new codes just because they are “there.”
  • In other words, it will likely become an artisanal pursuit, of interest to a much smaller but more committed set of enthusiasts. And weird as that is, it is in its way a kind of progress.
criscimagnael

Can Forensic Science Be Trusted? - The Atlantic - 0 views

  • When asked, years later, why she had failed to photograph what she said she’d seen on the enhanced bedsheet, Yezzo replied, “This is one time that I didn’t manage to get it soon enough.” She added: “Operator error.”
  • The words were deployed as definitive by prosecutors—“the evidence is uncontroverted by the scientist, totally uncontroverted”
  • Michael Donnelly, now a justice on the Ohio Supreme Court, did not preside over this case, but he has had ample exposure to the use of forensic evidence. “As a trial judge,” he told me, “I sat there for 14 years. And when forensics experts testified, the jury hung on their every word.”
  • ...10 more annotations...
  • Forensic science, which drives the plots of movies and television shows, is accorded great respect by the public. And in the proper hands, it can provide persuasive insight. But in the wrong hands, it can trap innocent people in a vise of seeming inerrancy—and it has done so far too often. What’s more, although some forensic disciplines, such as DNA analysis, are reliable, others have been shown to have serious limitations.
  • Yezzo is not like Annie Dookhan, a chemist in a Massachusetts crime laboratory who boosted her productivity by falsifying reports and by “dry labbing”—that is, reporting results without actually conducting any tests.
  • Nor is Yezzo like Michael West, a forensic odontologist who claimed that he could identify bite marks on a victim and then match those marks to a specific person.
  • The deeper issue with forensic science lies not in malfeasance or corruption—or utter incompetence—but in the gray area where Yezzo can be found. Her alleged personal problems are unusual: Only because of them did the details of her long career come to light.
  • to the point of alignment; how rarely an analyst’s skills are called into question in court; and how seldom the performance of crime labs is subjected to any true oversight.
  • More than half of those exonerated by post-conviction DNA testing had been wrongly convicted based on flawed forensic evidence.
  • The quality of the work done in crime labs is almost never audited.
  • Even the best forensic scientists can fall prey to unintentional bias.
  • Study after study has demonstrated the power of cognitive bias.
  • Cognitive bias can of course affect anyone, in any circumstance—but it is particularly dangerous in a criminal-justice system where forensic scientists have wide latitude as well as some incentive to support the views of prosecutors and the police.
Javier E

Ian Hacking, Eminent Philosopher of Science and Much Else, Dies at 87 - The New York Times - 0 views

  • In an academic career that included more than two decades as a professor in the philosophy department of the University of Toronto, following appointments at Cambridge and Stanford, Professor Hacking’s intellectual scope seemed to know no bounds. Because of his ability to span multiple academic fields, he was often described as a bridge builder.
  • “Ian Hacking was a one-person interdisciplinary department all by himself,” Cheryl Misak, a philosophy professor at the University of Toronto, said in a phone interview. “Anthropologists, sociologists, historians and psychologists, as well as those working on probability theory and physics, took him to have important insights for their disciplines.”
  • Professor Hacking wrote several landmark works on the philosophy and history of probability, including “The Taming of Chance” (1990), which was named one of the best 100 nonfiction books of the 20th century by the Modern Library.
  • ...17 more annotations...
  • “I have long been interested in classifications of people, in how they affect the people classified, and how the effects on the people in turn change the classifications,” he wrote in “Making Up People
  • His work in the philosophy of science was groundbreaking: He departed from the preoccupation with questions that had long concerned philosophers. Arguing that science was just as much about intervention as it was about representation, he helped bring experimentation to center stage.
  • Regarding one such question — whether unseen phenomena like quarks and electrons were real or merely the theoretical constructs of physicists — he argued for reality in the case of phenomena that figured in experiments, citing as an example an experiment at Stanford that involved spraying electrons and positrons into a ball of niobium to detect electric charges. “So far as I am concerned,” he wrote, “if you can spray them, they’re real.”
  • His book “The Emergence of Probability” (1975), which is said to have inspired hundreds of books by other scholars, examined how concepts of statistical probability have evolved over time, shaping the way we understand not just arcane fields like quantum physics but also everyday life.
  • “I was trying to understand what happened a few hundred years ago that made it possible for our world to be dominated by probabilities,” he said in a 2012 interview with the journal Public Culture. “We now live in a universe of chance, and everything we do — health, sports, sex, molecules, the climate — takes place within a discourse of probabilities.”
  • Whatever the subject, whatever the audience, one idea that pervades all his work is that “science is a human enterprise,” Ragnar Fjelland and Roger Strand of the University of Bergen in Norway wrote when Professor Hacking won the Holberg Prize. “It is always created in a historical situation, and to understand why present science is as it is, it is not sufficient to know that it is ‘true,’ or confirmed. We have to know the historical context of its emergence.”
  • Hacking often argued that as the human sciences have evolved, they have created categories of people, and that people have subsequently defined themselves as falling into those categories. Thus does human reality become socially constructed.
  • In 2000, he became the first Anglophone to win a permanent position at the Collège de France, where he held the chair in the philosophy and history of scientific concepts until he retired in 2006.
  • “I call this the ‘looping effect,’” he added. “Sometimes, our sciences create kinds of people that in a certain sense did not exist before.”
  • In “Why Race Still Matters,” a 2005 article in the journal Daedalus, he explored how anthropologists developed racial categories by extrapolating from superficial physical characteristics, with lasting effects — including racial oppression. “Classification and judgment are seldom separable,” he wrote. “Racial classification is evaluation.”
  • Similarly, he once wrote, in the field of mental health the word “normal” “uses a power as old as Aristotle to bridge the fact/value distinction, whispering in your ear that what is normal is also right.”
  • In his influential writings about autism, Professor Hacking charted the evolution of the diagnosis and its profound effects on those diagnosed, which in turn broadened the definition to include a greater number of people.
  • Encouraging children with autism to think of themselves that way “can separate the child from ‘normalcy’ in a way that is not appropriate,” he told Public Culture. “By all means encourage the oddities. By no means criticize the oddities.”
  • His emphasis on historical context also illuminated what he called transient mental illnesses, which appear to be so confined to their time that they can vanish when times change.
  • “hysterical fugue” was a short-lived epidemic of compulsive wandering that emerged in Europe in the 1880s, largely among middle-class men who had become transfixed by stories of exotic locales and the lure of travel
  • His intellectual tendencies were unmistakable from an early age. “When he was 3 or 4 years old, he would sit and read the dictionary,” Jane Hacking said. “His parents were completely baffled.”
  • He wondered aloud, the interviewer noted, if the whole universe was governed by nonlocality — if “everything in the universe is aware of everything else.” “That’s what you should be writing about,” he said. “Not me. I’m a dilettante. My governing word is ‘curiosity.’”
Javier E

What Did Twitter Turn Us Into? - The Atlantic - 0 views

  • The bedlam of Twitter, fused with the brevity of its form, offers an interpretation of the virtual town square as a bustling, modernist city.
  • It’s easy to get stuck in a feedback loop: That which appears on Twitter is current (if not always true), and what’s current is meaningful, and what’s meaningful demands contending with. And so, matters that matter little or not at all gain traction by virtue of the fact that they found enough initial friction to start moving.
  • The platform is optimized to make the nonevent of its own exaggerated demise seem significant.
  • ...9 more annotations...
  • the very existence of tweets about an event can make that event seem newsworthy—by virtue of having garnered tweets. This supposed newsworthiness can then result in literal news stories, written by journalists and based on inspiration or sourcing from tweets themselves, or it can entail the further spread of a tweet’s message by on-platform engagement, such as likes and quote tweets. Either way, the nature of Twitter is to assert the importance of tweets.
  • Tweets appear more meaningful when amplified, and when amplified they inspire more tweets in the same vein. A thing becomes “tweetworthy” when it spreads but then also justifies its value both on and beyond Twitter by virtue of having spread. This is the “famous for being famous” effect
  • This propensity is not unique to Twitter—all social media possesses it. But the frequency and quantity of posts on Twitter, along with their brevity, their focus on text, and their tendency to be vectors of news, official or not, make Twitter a particularly effective amplification house of mirrors
  • At least in theory. In practice, Twitter is more like an asylum, inmates screaming at everyone and no one in particular, histrionics displacing reason, posters posting at all costs because posting is all that is possible
  • Twitter shapes an epistemology for users under its thrall. What can be known, and how, becomes infected by what has, or can, be tweeted.
  • Producers of supposedly actual news see the world through tweet-colored glasses, by transforming tweets’ hypothetical status as news into published news—which produces more tweeting in turn.
  • For them, and others on this website, it has become an awful habit. Habits feel normal and even justified because they are familiar, not because they are righteous.
  • Twitter convinced us that it mattered, that it was the world’s news service, or a vector for hashtag activism, or a host for communities without voices, or a mouthpiece for the little gal or guy. It is those things, sometimes, for some of its users. But first, and mostly, it is a habit.
  • We never really tweeted to say something. We tweeted because Twitter offered a format for having something to say, over and over again. Just as the purpose of terrorism is terror, so the purpose of Twitter is tweeting.
Javier E

Everyone's Over Instagram - The Atlantic - 0 views

  • “Gen Z’s relationship with Instagram is much like millennials’ relationship with Facebook: Begrudgingly necessary,” Casey Lewis, a youth-culture consultant who writes the youth-culture newsletter After School, told me over email. “They don’t want to be on it, but they feel it’s weird if they’re not.”
  • a recent Piper Sandler survey found that, of 14,500 teens surveyed across 47 states, only 20 percent named Instagram their favorite social-media platform (TikTok came first, followed by Snapchat).
  • Simply being on Instagram is a very different thing from actively engaging with it. Participating means throwing pictures into a void, which is why it’s become kind of cringe. To do so earnestly suggests a blithe unawareness of your surroundings, like shouting into the phone in public.
  • ...10 more annotations...
  • In other words, Instagram is giving us the ick: that feeling when a romantic partner or crush does something small but noticeable—like wearing a fedora—that immediately turns you off forever.
  • “People who aren’t influencers only use [Instagram] to watch other people make big announcements,” Lee Tilghman, a former full-time Instagram influencer, told me over the phone. “My close friends who aren’t influencers, they haven’t posted in, like, two years.”
  • although Instagram now has 2 billion monthly users, it faces an existential problem: What happens when the 18-to-29-year-olds who are most likely to use the app, at least in America, age out or go elsewhere? Last year, The New York Times reported that Instagram was privately worried about attracting and retaining the new young users that would sustain its long-term growth—not to mention whose growing shopping potential is catnip to advertisers.
  • Over the summer, these frustrations boiled over. An update that promised, among other things, algorithmically recommended video content that would fill the entire screen was a bridge too far. Users were fed up with watching the app contort itself into a TikTok copycat that prioritized video and recommended posts over photos from friends.
  • Internal documents obtained by The Wall Street Journal show that Instagram users spend 17.6 million hours a day watching Reels, Instagram’s TikTok knockoff, compared with the 197.8 million hours people spend watching TikTok every day. The documents also revealed that Reels engagement has declined by 13.6 percent in recent months, with most users generating “no engagement whatsoever.”
  • Instagram may not be on its deathbed, but its transformation from cool to cringe is a sea change in the social-media universe. The platform was perhaps the most significant among an old generation of popular apps that embodied the original purpose of social media: to connect online with friends and family. Its decline is about not just a loss of relevance, but a capitulation to a new era of “performance” media, in which we create online primarily to reach people we don’t know instead of the people we do.
  • Lavish brand deals, in which an influencer promotes a brand’s product to their audience for a fee, have been known to pay anywhere from $100 to $10,000 per post, depending on the size of the creator’s following and their engagement. Now Tilghman, who became an Instagram influencer in 2015 and at one point had close to 400,000 followers, says she’s seen her rate go down by 80 percent over the past five years. The market’s just oversaturated.
  • The author Jessica DeFino, who joined Instagram in 2018 on the advice of publishing agents, similarly began stepping back from the platform in 2020, feeling overwhelmed by the constant feedback of her following. She has now set up auto-replies to her Instagram DMs: If one of her 59,000 followers sends her a message, they’re met with an invitation to instead reach out to DeFino via email.
  • would she get back on Instagram as a regular user? Only if she “created a private, personal account — somewhere I could limit my interactions to just family and friends,” she says. “Like what Instagram was in the beginning, I guess.”
  • That is if, by then, Instagram’s algorithm-driven, recommendation-fueled, shopping-heavy interface would even let her. Ick.
Javier E

The New History Wars - The Atlantic - 0 views

  • Critical historians who thought they were winning the fight for control within the academy now face dire retaliation from outside the academy. The dizzying turn from seeming triumph in 2020 to imminent threat in 2022 has unnerved many practitioners of the new history. Against this background, they did not welcome it when their association’s president suggested that maybe their opponents had a smidgen of a point.
  • a background reality of the humanities in the contemporary academy: a struggle over who is entitled to speak about what. Nowhere does this struggle rage more fiercely than in anything to do with the continent of Africa. Who should speak? What may be said? Who will be hired?
  • One obvious escape route from the generational divide in the academy—and the way the different approaches to history, presentist and antiquarian, tend to map onto it—is for some people, especially those on the older and whiter side of the divide, to keep their mouths shut about sensitive issues
  • ...15 more annotations...
  • The political and methodological stresses within the historical profession are intensified by economic troubles. For a long time, but especially since the economic crisis of 2008, university students have turned away from the humanities, preferring to major in fields that seem to offer more certain and lucrative employment. Consequently, academic jobs in the humanities and especially in history have become radically more precarious for younger faculty—even as universities have sought to meet diversity goals in their next-generation hiring by expanding offerings in history-adjacent specialties, such as gender and ethnic studies.
  • The result has produced a generational divide. Younger scholars feel oppressed and exploited by universities pressing them to do more labor for worse pay with less security than their elders; older scholars feel that overeager juniors are poised to pounce on the least infraction as an occasion to end an elder’s career and seize a job opening for themselves. Add racial difference as an accelerant, and what was intended as an interesting methodological discussion in a faculty newsletter can explode into a national culture war.
  • One of the greatest American Africanists was the late Philip Curtin. He wrote one of the first attempts to tally the exact number of persons trafficked by the transatlantic slave trade. Upon publication in 1972, his book was acclaimed as a truly pioneering work of history. By 1995, however, he was moved to protest against trends in the discipline at that time in an article in the Chronicle of Higher Education: I am troubled by increasing evidence of the use of racial criteria in filling faculty posts in the field of African history … This form of intellectual apartheid has been around for several decades, but it appears to have become much more serious in the past few years, to the extent that white scholars trained in African history now have a hard time finding jobs.
  • Much of academia is governed these days by a joke from the Soviet Union: “If you think it, don’t speak it. If you speak it, don’t write it. If you write it, don’t sign it. But if you do think it, speak it, write it, and sign it—don’t be surprised.”
  • Yet this silence has consequences, too. One of the most unsettling is the displacement of history by mythmaking
  • mythmaking is spreading from “just the movies” to more formal and institutional forms of public memory. If old heroes “must fall,” their disappearance opens voids for new heroes to be inserted in their place—and that insertion sometimes requires that new history be fabricated altogether, the “bad history” that Sweet tried to warn against.
  • If it is not the job of the president of the American Historical Association to confront those questions, then whose is it?
  • Sweet used a play on words—“Is History History?”—for the title of his complacency-shaking essay. But he was asking not whether history is finished, done with, but Is history still history? Is it continuing to do what history is supposed to do? Or is it being annexed for other purposes, ideological rather than historical ones?
  • Advocates of studying the more distant past to disturb and challenge our ideas about the present may accuse their academic rivals of “presentism.”
  • In real life, of course, almost everybody who cares about history believes in a little of each option. But how much of each? What’s the right balance? That’s the kind of thing that historians do argue about, and in the arguing, they have developed some dismissive labels for one another
  • Those who look to the more recent past to guide the future may accuse the other camp of “antiquarianism.”
  • The accusation of presentism hurts because it implies that the historian is sacrificing scholarly objectivity for ideological or political purposes. The accusation of antiquarianism stings because it implies that the historian is burrowing into the dust for no useful purpose at all.
  • In his mind, he was merely reopening one of the most familiar debates in professional history: the debate over why? What is the value of studying the past? To reduce the many available answers to a stark choice: Should we study the more distant past to explore its strangeness—and thereby jolt ourselves out of easy assumptions that the world we know is the only possible one?
  • Or should we study the more recent past to understand how our world came into being—and thereby learn some lessons for shaping the future?
  • The August edition of the association’s monthly magazine featured, as usual, a short essay by the association’s president, James H. Sweet, a professor at the University of Wisconsin at Madison. Within hours of its publication, an outrage volcano erupted on social media. A professor at Cornell vented about the author’s “white gaze.”
Javier E

Elon Musk's Texts Shatter the Myth of the Tech Genius - The Atlantic - 0 views

  • The texts also cast a harsh light on the investment tactics of Silicon Valley’s best and brightest. There’s Calacanis’s overeager angel-investing pitches, and then you have the
  • “This is one of the most telling things I’ve ever seen about how investing works in Silicon Valley,” Jessica Lessin, the founder of the tech publication The Information, tweeted of the Andreessen exchange. Indeed, both examples from the document offer a look at the boys’ club and power networks of the tech world in action.
  • the eagerness to pony up for Musk and the lazy quality of this dealmaking reveal something deeper about the brokenness of this investment ecosystem and the ways that it is driven more by vibes and grievances than due diligence.
  • ...3 more annotations...
  • For this crew, the early success of their past companies or careers is usually prologue, and their skills will, of course, transfer to any area they choose to conquer (including magically solving free speech). But what they are actually doing is winging it.
  • There is a tendency, especially when it comes to the über-rich and powerful, to assume and to fantasize about what we can’t see. We ascribe shadowy brilliance or malevolence, which may very well be unearned or misguided.
  • What’s striking about the Musk messages, then, is the similarity between these men’s behavior behind closed doors and in public on Twitter. Perhaps the real revelation here is that the shallowness you see is the shallowness you get.
Javier E

Scientists Can No Longer Ignore Ancient Flooding Tales - The Atlantic - 0 views

  • It wasn’t long after Henry David Inglis arrived on the island of Jersey, just northwest of France, that he heard the old story. Locals eagerly told the 19th-century Scottish travel writer how, in a bygone age, their island had been much more substantial, and that folks used to walk to the French coast. The only hurdle to their journey was a river—one easily crossed using a short bridge.
  • there had been a flood. A big one. Between roughly 15,000 and 6,000 years ago, massive flooding caused by melting glaciers raised sea levels around Europe. That flooding is what eventually turned Jersey into an island.
  • Rather than being a ridiculous claim not worthy of examination, perhaps the old story was true—a whisper from ancestors who really did walk through now-vanished lands
  • ...8 more annotations...
  • That’s exactly what the geographer Patrick Nunn and the historian Margaret Cook at the University of the Sunshine Coast in Australia have proposed in a recent paper.
  • In their work, the pair describe colorful legends from northern Europe and Australia that depict rising waters, peninsulas becoming islands, and receding coastlines during that period of deglaciation thousands of years ago. Some of these stories, the researchers say, capture historical sea-level rise that actually happened—often several thousand years ago. For scholars of oral history, that makes them geomyths.
  • “The first time I read an Aboriginal story from Australia that seemed to recall the rise of sea levels after the last ice age, I thought, No, I don’t think this is correct,” Nunn says. “But then I read another story that recalled the same thing.”
  • For Jo Brendryen, a paleoclimatologist at the University of Bergen in Norway who has studied the effects of deglaciation in Europe following the end of the last ice age, the idea that traditional oral histories preserve real accounts of sea-level rise is perfectly plausible.
  • During the last ice age, he says, the sudden melting of ice sheets induced catastrophic events known as meltwater pulses, which caused sudden and extreme sea-level rise. Along some coastlines in Europe, the ocean may have risen as much as 10 meters in just 200 years. At such a pace, it would have been noticeable to people across just a few human generations.
  • “These are stories based in trauma, based in catastrophe.”
  • That, he suggests, is why it may have made sense for successive generations to pass on tales of geological upheaval. Ancient societies may have sought to broadcast their warning: Beware, these things can happen!
  • the old stories still have things to teach us. As Nunn says, “The fact that our ancestors have survived those periods gives us hope that we can survive this.”
Javier E

A Commencement Address Too Honest to Deliver in Person - The Atlantic - 0 views

  • Use this hiatus to do something you would never have done if this emergency hadn’t hit. When the lockdown lifts, move to another state or country. Take some job that never would have made sense if you were worrying about building a career—bartender, handyman, AmeriCorps volunteer.
  • If you use the next two years as a random hiatus, you may not wind up richer, but you’ll wind up more interesting.
  • The biggest way most colleges fail is this: They don’t plant the intellectual and moral seeds students are going to need later, when they get hit by the vicissitudes of life.
  • ...13 more annotations...
  • If you didn’t study Jane Austen while you were here, you probably lack the capacity to think clearly about making a marriage decision. If you didn’t read George Eliot, then you missed a master class on how to judge people’s character. If you didn’t read Nietzsche, you are probably unprepared to handle the complexities of atheism—and if you didn’t read Augustine and Kierkegaard, you’re probably unprepared to handle the complexities of faith.
  • The list goes on. If you didn’t read de Tocqueville, you probably don’t understand your own country. If you didn’t study Gibbon, you probably lack the vocabulary to describe the rise and fall of cultures and nations.
  • The wisdom of the ages is your inheritance; it can make your life easier. These resources often fail to get shared because universities are too careerist, or because faculty members are more interested in their academic specialties or politics than in teaching undergraduates, or because of a host of other reasons
  • What are you putting into your mind? Our culture spends a lot less time worrying about this, and when it does, it goes about it all wrong.
  • my worry is that, especially now that you’re out of college, you won’t put enough really excellent stuff into your brain.
  • I worry that it’s possible to grow up now not even aware that those upper registers of human feeling and thought exist.
  • The theory of maximum taste says that each person’s mind is defined by its upper limit—the best that it habitually consumes and is capable of consuming.
  • After college, most of us resolve to keep doing this kind of thing, but we’re busy and our brains are tired at the end of the day. Months and years go by. We get caught up in stuff, settle for consuming Twitter and, frankly, journalism. Our maximum taste shrinks.
  • I’m worried about the future of your maximum taste. People in my and earlier generations, at least those lucky enough to get a college education, got some exposure to the classics, which lit a fire that gets rekindled every time we sit down to read something really excellent.
  • the “theory of maximum taste.” This theory is based on the idea that exposure to genius has the power to expand your consciousness. If you spend a lot of time with genius, your mind will end up bigger and broader than if you spend your time only with run-of-the-mill stuff.
  • the whole culture is eroding the skill the UCLA scholar Maryanne Wolf calls “deep literacy,” the ability to deeply engage in a dialectical way with a text or piece of philosophy, literature, or art.
  • Or as the neurologist Richard Cytowic put it to Adam Garfinkle, “To the extent that you cannot perceive the world in its fullness, to the same extent you will fall back into mindless, repetitive, self-reinforcing behavior, unable to escape.”
  • I can’t say that to you, because it sounds fussy and elitist and OK Boomer. And if you were in front of me, you’d roll your eyes.
Javier E

How the Shoggoth Meme Has Come to Symbolize the State of A.I. - The New York Times - 0 views

  • the Shoggoth had become a popular reference among workers in artificial intelligence, as a vivid visual metaphor for how a large language model (the type of A.I. system that powers ChatGPT and other chatbots) actually works.
  • it was only partly a joke, he said, because it also hinted at the anxieties many researchers and engineers have about the tools they’re building.
  • Since then, the Shoggoth has gone viral, or as viral as it’s possible to go in the small world of hyper-online A.I. insiders. It’s a popular meme on A.I. Twitter (including a now-deleted tweet by Elon Musk), a recurring metaphor in essays and message board posts about A.I. risk, and a bit of useful shorthand in conversations with A.I. safety experts. One A.I. start-up, NovelAI, said it recently named a cluster of computers “Shoggy” in homage to the meme. Another A.I. company, Scale AI, designed a line of tote bags featuring the Shoggoth.
  • ...17 more annotations...
  • Most A.I. researchers agree that models trained using R.L.H.F. are better behaved than models without it. But some argue that fine-tuning a language model this way doesn’t actually make the underlying model less weird and inscrutable. In their view, it’s just a flimsy, friendly mask that obscures the mysterious beast underneath.
  • In a nutshell, the joke was that in order to prevent A.I. language models from behaving in scary and dangerous ways, A.I. companies have had to train them to act polite and harmless. One popular way to do this is called “reinforcement learning from human feedback,” or R.L.H.F., a process that involves asking humans to score chatbot responses, and feeding those scores back into the A.I. model.
  • Shoggoths are fictional creatures, introduced by the science fiction author H.P. Lovecraft in his 1936 novella “At the Mountains of Madness.” In Lovecraft’s telling, Shoggoths were massive, blob-like monsters made out of iridescent black goo, covered in tentacles and eyes.
  • @TetraspaceWest said, wasn’t necessarily implying that it was evil or sentient, just that its true nature might be unknowable.
  • And it reinforces the notion that what’s happening in A.I. today feels, to some of its participants, more like an act of summoning than a software development process. They are creating the blobby, alien Shoggoths, making them bigger and more powerful, and hoping that there are enough smiley faces to cover the scary parts.
  • “I was also thinking about how Lovecraft’s most powerful entities are dangerous — not because they don’t like humans, but because they’re indifferent and their priorities are totally alien to us and don’t involve humans, which is what I think will be true about possible future powerful A.I.”
  • when Bing’s chatbot became unhinged and tried to break up my marriage, an A.I. researcher I know congratulated me on “glimpsing the Shoggoth.” A fellow A.I. journalist joked that when it came to fine-tuning Bing, Microsoft had forgotten to put on its smiley-face mask.
  • @TetraspaceWest, the meme’s creator, told me in a Twitter message that the Shoggoth “represents something that thinks in a way that humans don’t understand and that’s totally different from the way that humans think.”
  • In any case, the Shoggoth is a potent metaphor that encapsulates one of the most bizarre facts about the A.I. world, which is that many of the people working on this technology are somewhat mystified by their own creations. They don’t fully understand the inner workings of A.I. language models, how they acquire new capabilities or why they behave unpredictably at times. They aren’t totally sure if A.I. is going to be net-good or net-bad for the world.
  • That some A.I. insiders refer to their creations as Lovecraftian horrors, even as a joke, is unusual by historical standards. (Put it this way: Fifteen years ago, Mark Zuckerberg wasn’t going around comparing Facebook to Cthulhu.)
  • If it’s an A.I. safety researcher talking about the Shoggoth, maybe that person is passionate about preventing A.I. systems from displaying their true, Shoggoth-like nature.
  • A great many people are dismissive of suggestions that any of these systems are “really” thinking, because they’re “just” doing something banal (like making statistical predictions about the next word in a sentence). What they fail to appreciate is that there is every reason to suspect that human cognition is “just” doing those exact same things. It matters not that birds flap their wings but airliners don’t. Both fly. And these machines think. And, just as airliners fly faster and higher and farther than birds while carrying far more weight, these machines are already outthinking the majority of humans at the majority of tasks. Further, that machines aren’t perfect thinkers is about as relevant as the fact that air travel isn’t instantaneous. Now consider: we’re well past the Wright flyer level of thinking machine, past the early biplanes, somewhere about the first commercial airline level. Not quite the DC-10, I think. Can you imagine what the AI equivalent of a 777 will be like? Fasten your seatbelts.
  • @thomas h. You make my point perfectly. You’re observing that the way a plane flies — by using a turbine to generate thrust from combusting kerosene, for example — is nothing like the way that a bird flies, which is by using the energy from eating plant seeds to contract the muscles in its wings to make them flap. You are absolutely correct in that observation, but it’s also almost utterly irrelevant. And it ignores that, to a first approximation, there’s no difference in the physics you would use to describe a hawk riding a thermal and an airliner gliding (essentially) unpowered in its final descent to the runway. Further, you do yourself a grave disservice in being dismissive of the abilities of thinking machines, in exactly the same way that early skeptics have been dismissive of every new technology in all of human history. Writing would make people dumb; automobiles lacked the intelligence of horses; no computer could possibly beat a chess grandmaster because it can’t comprehend strategy; and on and on and on. Humans aren’t nearly as special as we fool ourselves into believing. If you want to have any hope of acting responsibly in the age of intelligent machines, you’ll have to accept that, like it or not, and whether or not it fits with your preconceived notions of what thinking is and how it is or should be done … machines can and do think, many of them better than you in a great many ways. b&
  • @BLA. You are incorrect. Everything has nature. Its nature is manifested in making humans react. Sure, no humans, no nature, but here we are. The writer and various sources are not attributing nature to AI so much as admitting that they don’t know what this nature might be, and there are reasons to be scared of it. More concerning to me is the idea that this field is resorting to geek culture reference points to explain and comprehend itself. It’s not so much the algorithm has no soul, but that the souls of the humans making it possible are stupendously and tragically underdeveloped.
  • When even tech companies are saying AI is moving too fast, and the articles land on page 1 of the NYT (there's an old reference), I think the greedy will not think twice about exploiting this technology, with no ethical considerations, at all.
  • @nome sane? The problem is it isn't data as we understand it. We know what the datasets are -- they were used to train the AI's. But once trained, the AI is thinking for itself, with results that have surprised everybody.
  • The unique feature of a shoggoth is it can become whatever is needed for a particular job. There's no actual shape so it's not a bad metaphor, if an imperfect image. Shoggoths also turned upon and destroyed their creators, so the cautionary metaphor is in there, too. A shame more Asimov wasn't baked into AI. But then the conflict about how to handle AI in relation to people was key to those stories, too.