TOK Friends / Group items tagged: humor

Javier E

What Do We Lose If We Lose Twitter? - The Atlantic

  • What do we lose if we lose Twitter?
  • At its best, Twitter can still provide that magic of discovering a niche expert or elevating a necessary, insurgent voice, but there is far more noise than signal. Plenty of those overenthusiastic voices, brilliant thinkers, and influential accounts have burned out on culture-warring, or have been harassed off the site or into lurking.
  • many of the most hyperactive, influential twitterati (cringe) of the mid-2010s have built up large audiences and only broadcast now: They don’t read their mentions, and they rarely engage. In private conversations, some of those people have expressed a desire to see Musk torpedo the site and put a legion of posters out of their misery.
  • Perhaps the best example of what Twitter offers now—and what we stand to gain or lose from its demise—is illustrated by the path charted by public-health officials, epidemiologists, doctors, and nurses over the past three years.
  • They offered guidance that a flailing government response was too slow to provide, and helped cobble together an epidemiological picture of infections and case counts. At a moment when people were terrified and looking for any information at all, Twitter seemed to offer a steady stream of knowledgeable, diligent experts.
  • But Twitter does another thing quite well, and that’s crushing users with the pressures of algorithmic rewards and all of the risks, exposure, and toxicity that come with virality
  • imagining a world without it can feel impossible. What do our politics look like without the strange feedback loop of a Twitter-addled political press and a class of lawmakers that seems to govern more via shitposting than by legislation?
  • What happens if the media lose what the writer Max Read recently described as a “way of representing reality, and locating yourself within it”? The answer is probably messy.
  • here’s the worry that, absent a distributed central nervous system like Twitter, “the collective worldview of the ‘media’ would instead be over-shaped, from the top down, by the experiences and biases of wealthy publishers, careerist editors, self-loathing journalists, and canny operators operating in relatively closed social and professional circles.”
  • Twitter is, by some standards, a niche platform, far smaller than Facebook or Instagram or TikTok. The internet will evolve or mutate around a need for it. I am aware that all of us who can’t quit the site will simply move on when we have to.
  • Many of the past decade’s most polarizing and influential figures—people such as Donald Trump and Musk himself, who captured attention, accumulated power, and fractured parts of our public consciousness—were also the ones who were thought to be “good” at using the website.
  • the effects of Twitter’s chief innovation—its character limit—on our understanding of language, nuance, and even truth.
  • “These days, it seems like we are having languages imposed on us,” he said. “The fact that you have a social media that tells you how many characters to use, this is language imposition. You have to wonder about the agenda there. Why does anyone want to restrict the full range of my language? What’s the game there?”
  • in McLuhanian fashion, the constraints and the architecture change not only what messages we receive but how we choose to respond. Often that choice is to behave like the platform itself: We are quicker to respond and more aggressive than we might be elsewhere, with a mindset toward engagement and visibility
  • it’s easy to argue that we stand to gain something essential and human if we lose Twitter. But there is plenty about Twitter that is also essential and human.
  • No other tool has connected me to the world—to random bits of news, knowledge, absurdist humor, activism, and expertise, and to scores of real personal interactions—like Twitter has
  • What makes evaluating a life beyond Twitter so hard is that everything that makes the service truly special is also what makes it interminable and toxic.
  • the worst experience you can have on the platform is to “win” and go viral. Generally, it seems that the more successful a person is at using Twitter, the more they refer to it as a hellsite.
Javier E

Instagram's Algorithm Delivers Toxic Video Mix to Adults Who Follow Children - WSJ

  • Instagram’s Reels video service is designed to show users streams of short videos on topics the system decides will interest them, such as sports, fashion or humor. 
  • The Meta Platforms-owned social app does the same thing for users its algorithm decides might have a prurient interest in children, testing by The Wall Street Journal showed.
  • The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform.
  • “Our systems are effective at reducing harmful content, and we’ve invested billions in safety, security and brand suitability solutions,” said Samantha Stetson, a Meta vice president who handles relations with the advertising industry. She said the prevalence of inappropriate content on Instagram is low, and that the company invests heavily in reducing it.
  • The Journal set up the test accounts after observing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults
  • The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.
  • The Canadian Centre for Child Protection, a child-protection group, separately ran similar tests on its own, with similar results.
  • Meta said the Journal’s tests produced a manufactured experience that doesn’t represent what billions of users see. The company declined to comment on why the algorithms compiled streams of separate videos showing children, sex and advertisements, but a spokesman said that in October it introduced new brand safety tools that give advertisers greater control over where their ads appear, and that Instagram either removes or reduces the prominence of four million videos suspected of violating its standards each month. 
  • The Journal reported in June that algorithms run by Meta, which owns both Facebook and Instagram, connect large communities of users interested in pedophilic content. The Meta spokesman said a task force set up after the Journal’s article has expanded its automated systems for detecting users who behave suspiciously, taking down tens of thousands of such accounts each month. The company also is participating in a new industry coalition to share signs of potential child exploitation.
  • Following what it described as Meta’s unsatisfactory response to its complaints, Match began canceling Meta advertising for some of its apps, such as Tinder, in October. It has since halted all Reels advertising and stopped promoting its major brands on any of Meta’s platforms. “We have no desire to pay Meta to market our brands to predators or place our ads anywhere near this content,” said Match spokeswoman Justine Sacco.
  • Even before the 2020 launch of Reels, Meta employees understood that the product posed safety concerns, according to former employees.
  • Robbie McKay, a spokesman for Bumble, said it “would never intentionally advertise adjacent to inappropriate content,” and that the company is suspending its ads across Meta’s platforms.
  • Meta created Reels to compete with TikTok, the video-sharing platform owned by Beijing-based ByteDance. Both products feed users a nonstop succession of videos posted by others, and make money by inserting ads among them. Both companies’ algorithms show to a user videos the platforms calculate are most likely to keep that user engaged, based on his or her past viewing behavior
  • The Journal reporters set up the Instagram test accounts as adults on newly purchased devices and followed the gymnasts, cheerleaders and other young influencers. The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch. 
  • When the test accounts then followed some users who followed those same young people’s accounts, they yielded even more disturbing recommendations. The platform served a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.
  • Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them.
  • Current and former Meta employees said in interviews that the tendency of Instagram algorithms to aggregate child sexualization content from across its platform was known internally to be a problem. Once Instagram pigeonholes a user as interested in any particular subject matter, they said, its recommendation systems are trained to push more related content to them.
  • Preventing the system from pushing noxious content to users interested in it, they said, requires significant changes to the recommendation algorithms that also drive engagement for normal users. Company documents reviewed by the Journal show that the company’s safety staffers are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount.
  • The test accounts showed that advertisements were regularly added to the problematic Reels streams. Ads encouraging users to visit Disneyland for the holidays ran next to a video of an adult acting out having sex with her father, and another of a young woman in lingerie with fake blood dripping from her mouth. An ad for Hims ran shortly after a video depicting an apparently anguished woman in a sexual situation along with a link to what was described as “the full video.”
  • Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands.
  • Part of the problem is that automated enforcement systems have a harder time parsing video content than text or still images. Another difficulty arises from how Reels works: Rather than showing content shared by users’ friends, the way other parts of Instagram and Facebook often do, Reels promotes videos from sources they don’t follow
  • In an analysis conducted shortly before the introduction of Reels, Meta’s safety staff flagged the risk that the product would chain together videos of children and inappropriate content, according to two former staffers. Vaishnavi J, Meta’s former head of youth policy, described the safety review’s recommendation as: “Either we ramp up our content detection capabilities, or we don’t recommend any minor content,” meaning any videos of children.
  • At the time, TikTok was growing rapidly, drawing the attention of Instagram’s young users and the advertisers targeting them. Meta didn’t adopt either of the safety analysis’s recommendations at that time, according to J.
  • Stetson, Meta’s liaison with digital-ad buyers, disputed that Meta had neglected child safety concerns ahead of the product’s launch. “We tested Reels for nearly a year before releasing it widely, with a robust set of safety controls and measures,” she said. 
  • After initially struggling to maximize the revenue potential of its Reels product, Meta has improved how its algorithms recommend content and personalize video streams for users
  • Among the ads that appeared regularly in the Journal’s test accounts were those for “dating” apps and livestreaming platforms featuring adult nudity, massage parlors offering “happy endings” and artificial-intelligence chatbots built for cybersex. Meta’s rules are supposed to prohibit such ads.
  • The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere. 
  • As of mid-November, the center said, Instagram was continuing to steadily recommend what the nonprofit described as “adults and children doing sexual posing.”
  • Meta hasn’t offered a timetable for resolving the problem or explained how in the future it would restrict the promotion of inappropriate content featuring children. 
  • The Journal’s test accounts found that the problem even affected Meta-related brands. Ads for the company’s WhatsApp encrypted chat service and Meta’s Ray-Ban Stories glasses appeared next to adult pornography. An ad for Lean In Girls, the young women’s empowerment nonprofit run by former Meta Chief Operating Officer Sheryl Sandberg, ran directly before a promotion for an adult sex-content creator who often appears in schoolgirl attire. Sandberg declined to comment. 
  • Through its own tests, the Canadian Centre for Child Protection concluded that Instagram was regularly serving videos and pictures of clothed children who also appear in the National Center for Missing and Exploited Children’s digital database of images and videos confirmed to be child abuse sexual material. The group said child abusers often use the images of the girls to advertise illegal content for sale in dark-web forums.
  • The nature of the content—sexualizing children without generally showing nudity—reflects the way that social media has changed online child sexual abuse, said Lianna McDonald, executive director for the Canadian center. The group has raised concerns about the ability of Meta’s algorithms to essentially recruit new members of online communities devoted to child sexual abuse, where links to illicit content in more private forums proliferate.
  • “Time and time again, we’ve seen recommendation algorithms drive users to discover and then spiral inside of these online child exploitation communities,” McDonald said, calling it disturbing that ads from major companies were subsidizing that process.
Javier E

Opinion | Where Have all the Adults in Children's Books Gone? - The New York Times

  • Some might see the entrenchment of child-centeredness in children’s literature as reinforcing what some social critics consider a rising tide of narcissism in young people today. But to be fair: Such criticisms of youth transcend the ages.
  • What is certainly true now is the primacy of “mirrors and windows,” a philosophy that strives to show children characters who reflect how they look back to them, as well as those from different backgrounds, mostly with an eye to diversity.
  • This is a noble goal, but those mirrors and windows should apply to adults as well. Adults are, after all, central figures in children’s lives — their parents and caregivers, their teachers, their role models
  • The implicit lesson is that grown-ups aren’t infallible. It’s OK to laugh at them and it’s OK to feel compassion for them and it’s even OK to feel sorry for them on occasion.
  • The adult figures in children’s literature are also frequently outsiders or eccentrics in some way, and quite often subject to ridicule
  • yes, adults are often the Other — which makes them a mystery and a curiosity. Literature offers insight into these occasionally intimidating creatures.
  • In real life, children revere adults and they fear them. It only follows, then, that they appreciate when adult characters behave admirably but also delight in seeing the consequences — especially when rendered with humor — when they don’t.
  • Nursery rhymes, folk tales, myths and legends overwhelmingly cast adults as their central characters — and have endured for good reason
  • In somewhat later tales, children investigated crimes alongside Sherlock Holmes, adventured through Narnia, inhabited Oz and traversed Middle-earth. Grown-up heroes can be hobbits, or rabbits (“Watership Down”), badgers or moles (“The Wind in the Willows”). Children join them no matter what because they like to be in league with their protagonists and by extension, their authors.
  • In children’s books with adult heroes, children get to conspire alongside their elders. Defying the too-often adversarial relationship between adults and children in literature, such books enable children to see that adults are perfectly capable of occupying their shared world with less antagonism — as partners in life, in love and in adventure.