
Javier E

Elliot Ackerman Went From U.S. Marine to Bestselling Novelist - WSJ - 0 views

  • Years before he impressed critics with his first novel, “Green on Blue” (2015), written from the perspective of an Afghan boy, Ackerman was already, in his words, “telling stories and inhabiting the minds of others.” He explains that much of his work as a special-operations officer involved trying to grasp what his adversaries were thinking, to better anticipate how they might act.
  • “Look, I really believe in stories, I believe in art, I believe that this is how we express our humanity,” he says. “You can’t understand a society without understanding the stories they tell about themselves, and how these stories are constantly changing.”
  • This, in essence, is the subject of “Halcyon,” in which a scientific breakthrough allows Robert Ableson, a World War II hero and renowned lawyer, to come back from the dead. Yet the 21st-century America he returns to feels like a different place, riven by debates over everything from Civil War monuments to workplace misconduct.
  • The novel probes how nothing in life is fixed, including the legacies of the dead and the stories we tell about our past.
  • “The study of history shouldn’t be backward looking,” explains a historian in “Halcyon.” “To matter, it has to take us forward.”
  • Ackerman was in college on Sept. 11, 2001, but what he remembers more vividly is watching the premiere of the TV miniseries “Band of Brothers” the previous Sunday. “If you wanted to know the zeitgeist in the U.S. at the time, it was this very sentimental view of World War II,” he says. “There was this nostalgia for a time where we’re the good guys, they’re the bad guys, and we’re going to liberate oppressed people.”
  • Ackerman, who also covers wars and veteran affairs as a journalist, says that America’s backing of Ukraine is essential in the face of what he calls “an authoritarian axis rising up in the world, with China, Russia and Iran.” Were the country to offer similar help to Taiwan in the face of an invasion from China, he notes, having some air bases in nearby Afghanistan would help, but the U.S. gave those up in 2021.
  • With Islamic fundamentalists now in control of places where he lost friends, he says he is often asked if he regrets his service. “When you are a young man and your country goes to war, you’re presented with a choice: You either fight or you don’t,” he writes in his 2019 memoir “Places and Names.” “I don’t regret my choice, but maybe I regret being asked to choose.”
  • Serving in the military at a time when wars are no longer generation-defining events has proven alienating for Ackerman. “When you’ve got wars with an all-volunteer military funded through deficit spending, they can go on forever because there are no political costs.”
  • The catastrophic withdrawal from Afghanistan in 2021, which Ackerman covers in his recent memoir “The Fifth Act,” compounded this moral injury. “The fact that there has been so little government support for our Afghan allies has left it to vets to literally clean this up,” he says, noting that he still fields requests for help on WhatsApp. He adds that unless lawmakers act, the tens of thousands of Afghans currently living in the U.S. on humanitarian parole will be sent back to Taliban-held Afghanistan later this year: “It’s very painful to see how our allies are treated.”
  • Looking back on America’s misadventures in Iraq, Afghanistan and elsewhere, he notes that “the stories we tell about war are really important to the decisions we make around war. It’s one reason why storytelling fills me with a similar sense of purpose.”
  • “We don’t talk about the world and our place in it in a holistic way, or a strategic way,” Ackerman says. “We were telling a story about ending America’s longest war, when the one we should’ve been telling was about repositioning ourselves in a world that’s becoming much more dangerous,” he adds. “Our stories sometimes get us in trouble, and we’re still dealing with that trouble today.”
Javier E

Opinion | Transgender biology debates should focus on the brain, not the body - The Was... - 0 views

  • what, then, is a biological male, or female? What determines this supposedly simple truth? It’s about chromosomes, right?
  • The study found that adolescent boys and girls who described themselves as trans responded like the peers of their perceived gender.
  • It may be that what’s in your pants is less important than what’s between your ears.
  • What the research has found is that the brains of trans people are unique: neither female nor male, exactly, but something distinct.
  • But what does that mean, a male brain, or a female brain, or even a transgender one?
  • It’s a fraught topic, because brains are a collection of characteristics, rather than a binary classification of either/or
  • yet scientists continue to study the brain in hopes of understanding whether a sense of the gendered self can, at least in part, be the result of neurology
  • Well, not entirely. Because not every person with a Y chromosome is male, and not every person with a double X is female. The world is full of people with other combinations: XXY (or Klinefelter Syndrome), XXX (or Trisomy X), XXXY, and so on. There’s even something called Androgen Insensitivity Syndrome, a condition that keeps the brains of people with a Y from absorbing the information in that chromosome. Most of these people develop as female, and may not even know about their condition until puberty — or even later.
  • But there’s a problem with using neurology as an argument for trans acceptance — it suggests that, on some level, there is something wrong with transgender people, that we are who we are as a result of a sickness or a biological hiccup.
  • trying to open people’s hearts by saying “Check out my brain!” can do more harm than good, because this line of argument delegitimizes the experiences of many trans folks. It suggests that there’s only one way to be trans — to feel trapped in the wrong body, to go through transition, and to wind up, when all is said and done, on the opposite-gender pole. It suggests that the quest trans people go on can only be considered successful if it ends with fitting into the very society that rejected us in the first place.
  • All the science tells us, in the end, is that a biological male — or female — is not any one thing, but a collection of possibilities.
  • No one who embarks upon a life as a trans person in this country is doing so out of caprice, or a whim, or a delusion. We are living these wondrous and perilous lives for one reason only — because our hearts demand it.
  • what we need now is not new legislation to make things harder. What we need now is understanding, not cruelty. What we need now is not hatred, but love.
  • the important thing is not that they feel like a woman, or a man, or something else. What matters most is the plaintive desire to be free to feel the way I feel.
Javier E

His Job Was to Make Instagram Safe for Teens. His 14-Year-Old Showed Him What the App W... - 0 views

  • The experience of young users on Meta’s Instagram—where Bejar had spent the previous two years working as a consultant—was especially acute. In a subsequent email to Instagram head Adam Mosseri, one statistic stood out: One in eight users under the age of 16 said they had experienced unwanted sexual advances on the platform over the previous seven days.
  • For Bejar, that finding was hardly a surprise. His daughter and her friends had been receiving unsolicited penis pictures and other forms of harassment on the platform since the age of 14, he wrote, and Meta’s systems generally ignored their reports—or responded by saying that the harassment didn’t violate platform rules.
  • “I asked her why boys keep doing that,” Bejar wrote to Zuckerberg and his top lieutenants. “She said if the only thing that happens is they get blocked, why wouldn’t they?”
  • For the well-being of its users, Bejar argued, Meta needed to change course, focusing less on a flawed system of rules-based policing and more on addressing such bad experiences.
  • The company would need to collect data on what upset users and then work to combat the source of it, nudging those who made others uncomfortable to improve their behavior and isolating communities of users who deliberately sought to harm others.
  • “I am appealing to you because I believe that working this way will require a culture shift,” Bejar wrote to Zuckerberg—the company would have to acknowledge that its existing approach to governing Facebook and Instagram wasn’t working.
  • During and after Bejar’s time as a consultant, Meta spokesman Andy Stone said, the company has rolled out several product features meant to address some of the Well-Being Team’s findings. Those features include warnings to users before they post comments that Meta’s automated systems flag as potentially offensive, and reminders to be kind when sending direct messages to users like content creators who receive a large volume of messages. 
  • Meta’s classifiers were reliable enough to remove only a low single-digit percentage of hate speech with any degree of precision.
  • Bejar was floored—all the more so when he learned that virtually all of his daughter’s friends had been subjected to similar harassment. “DTF?” a user they’d never met would ask, using shorthand for a vulgar proposition. Instagram acted so rarely on reports of such behavior that the girls no longer bothered reporting them. 
  • Meta’s own statistics suggested that big problems didn’t exist. 
  • Meta had come to approach governing user behavior as an overwhelmingly automated process. Engineers would compile data sets of unacceptable content—things like terrorism, pornography, bullying or “excessive gore”—and then train machine-learning models to screen future content for similar material.
  • While users could still flag things that upset them, Meta shifted resources away from reviewing them. To discourage users from filing reports, internal documents from 2019 show, Meta added steps to the reporting process. Meta said the changes were meant to discourage frivolous reports and educate users about platform rules. 
  • The outperformance of Meta’s automated enforcement relied on what Bejar considered two sleights of hand. The systems didn’t catch anywhere near the majority of banned content—only the majority of what the company ultimately removed.
  • “Please don’t talk about my underage tits,” Bejar’s daughter shot back before reporting his comment to Instagram. A few days later, the platform got back to her: The insult didn’t violate its community guidelines.
  • Also buttressing Meta’s statistics were rules written narrowly enough to ban only unambiguously vile material. Meta’s rules didn’t clearly prohibit adults from flooding the comments section on a teenager’s posts with kiss emojis or posting pictures of kids in their underwear, inviting their followers to “see more” in a private Facebook Messenger group. 
  • “Mark personally values freedom of expression first and foremost and would say this is a feature and not a bug,” Rosen responded.
  • Narrow rules and unreliable automated enforcement systems left a lot of room for bad behavior—but they made the company’s child-safety statistics look pretty good according to Meta’s metric of choice: prevalence.
  • Defined as the percentage of content viewed worldwide that explicitly violates a Meta rule, prevalence was the company’s preferred measuring stick for the problems users experienced.
  • According to prevalence, child exploitation was so rare on the platform that it couldn’t be reliably estimated, less than 0.05%, the threshold for functional measurement. Content deemed to encourage self-harm, such as eating disorders, was just as minimal, and rule violations for bullying and harassment occurred in just eight of 10,000 views. 
  • “There’s a grading-your-own-homework problem,”
  • Meta defines what constitutes harmful content, so it shapes the discussion of how successful it is at dealing with it.”
  • It could reconsider its AI-generated “beauty filters,” which internal research suggested made both the people who used them and those who viewed the images more self-critical.
  • the team built a new questionnaire called BEEF, short for “Bad Emotional Experience Feedback.”
  • A recurring survey of issues 238,000 users had experienced over the past seven days, the effort identified problems with prevalence from the start: Users were 100 times more likely to tell Instagram they’d witnessed bullying in the last week than Meta’s bullying-prevalence statistics indicated they should.
  • “People feel like they’re having a bad experience or they don’t,” one presentation on BEEF noted. “Their perception isn’t constrained by policy.”
  • Bad experiences seemed particularly common among teens on Instagram.
  • Among users under the age of 16, 26% recalled having a bad experience in the last week due to witnessing hostility against someone based on their race, religion or identity.
  • More than a fifth felt worse about themselves after viewing others’ posts, and 13% had experienced unwanted sexual advances in the past seven days. 
  • The vast gap between the low prevalence of content deemed problematic in the company’s own statistics and what users told the company they experienced suggested that Meta’s definitions were off, Bejar argued.
  • To minimize content that teenagers told researchers made them feel bad about themselves, Instagram could cap how much beauty- and fashion-influencer content users saw.
  • Proving to Meta’s leadership that the company’s prevalence metrics were missing the point was going to require data the company didn’t have. So Bejar and a group of staffers from the Well-Being Team started collecting it.
  • And it could build ways for users to report unwanted contacts, the first step to figuring out how to discourage them.
  • One experiment run in response to BEEF data showed that when users were notified that their comment or post had upset people who saw it, they often deleted it of their own accord. “Even if you don’t mandate behaviors,” said Krieger, “you can at least send signals about what behaviors aren’t welcome.”
  • But among the ranks of Meta’s senior middle management, Bejar and Krieger said, BEEF hit a wall. Managers who had made their careers on incrementally improving prevalence statistics weren’t receptive to the suggestion that the approach wasn’t working. 
  • After three decades in Silicon Valley, he understood that members of the company’s C-Suite might not appreciate a damning appraisal of the safety risks young users faced from its product—especially one citing the company’s own data. 
  • “This was the email that my entire career in tech trained me not to send,” he says. “But a part of me was still hoping they just didn’t know.”
  • “Policy enforcement is analogous to the police,” he wrote in the email Oct. 5, 2021—arguing that it’s essential to respond to crime, but that it’s not what makes a community safe. Meta had an opportunity to do right by its users and take on a problem that Bejar believed was almost certainly industrywide.
  • After Haugen’s airing of internal research, Meta had cracked down on the distribution of anything that would, if leaked, cause further reputational damage. With executives privately asserting that the company’s research division harbored a fifth column of detractors, Meta was formalizing a raft of new rules for employees’ internal communication.
  • Among the mandates for achieving “Narrative Excellence,” as the company called it, was to keep research data tight and never assert a moral or legal duty to fix a problem.
  • “I had to write about it as a hypothetical,” Bejar said. Rather than acknowledging that Instagram’s survey data showed that teens regularly faced unwanted sexual advances, the memo merely suggested how Instagram might help teens if they faced such a problem.
  • The hope that the team’s work would continue didn’t last. The company stopped conducting the specific survey behind BEEF, then laid off most everyone who’d worked on it as part of what Zuckerberg called Meta’s “year of efficiency.”
  • If Meta was to change, Bejar told the Journal, the effort would have to come from the outside. He began consulting with a coalition of state attorneys general who filed suit against the company late last month, alleging that the company had built its products to maximize engagement at the expense of young users’ physical and mental health. Bejar also got in touch with members of Congress about where he believes the company’s user-safety efforts fell short. 