
Group items tagged: Disabilities


More Young People Are on Multiple Psychiatric Drugs, Study Finds - The New York Times

  • The study, published Friday in JAMA Network Open, looked at prescribing patterns among patients 17 or younger enrolled in Medicaid from 2015 to 2020 in a single U.S. state that the researchers declined to name. In this group, there was a 9.5 percent increase in the prevalence of “polypharmacy,” which the study defined as taking three or more different classes of psychiatric medications, including antidepressants, mood-stabilizing anticonvulsants, sedatives, and drugs for A.D.H.D. and anxiety.
  • One recent paper drew data from the National Ambulatory Medical Care Survey and found that in 2015, 40.7 percent of people aged 2 to 24 in the United States who took a medication for A.D.H.D. also took a second psychiatric drug. That figure had risen from 26 percent in 2006.
  • At least in one state, the practice continues to grow and “was significantly more likely among youths who were disabled or in foster care,” the new study noted.
  • The latest study looked at data from 126,972 people over the study period. It found that in 2015, 4.2 percent of Medicaid enrollees under the age of 17 in Maryland had overlapping prescriptions of three or more different classes of psychiatric medications. That figure rose to 4.6 percent in 2020, the roughly 9.5 percent relative increase cited above (4.6 / 4.2 ≈ 1.095).

The Influencer Is a Young Teenage Girl. The Audience Is 92% Adult Men. - WSJ

  • Instagram makes it easy for strangers to find photos of children, and its algorithm is built to identify users’ interests and push similar content. Investigations by The Wall Street Journal and outside researchers have found that, upon recognizing that an account might be sexually interested in children, Instagram’s algorithm recommends child accounts for the user to follow, as well as sexual content related to both children and adults.
  • That algorithm has become the engine powering the growth of an insidious world in which young girls’ online popularity is perversely predicated on gaining large numbers of male followers. 
  • Instagram photos of young girls become a dark currency, swapped and discussed obsessively among men on encrypted messaging apps such as Telegram. The Journal reviewed dozens of conversations in which the men fetishized specific body parts and expressed pleasure in knowing that many parents of young influencers understand that hundreds, if not thousands, of pedophiles have found their children online.   
  • One man, speaking about one of his favorite young influencers in a Telegram exchange captured by a child-safety activist, said that her mother knew “damn well” that many of her daughter’s followers were “pervy adult men.”
  • Meta looms over everything young influencers do on Instagram. It connects their accounts with strangers, and it can upend their star turns when it chooses. The company periodically shuts down accounts if it determines they have violated policies against child sexual exploitation or abuse. Some parents say their accounts have been shut down without such violations. 
  • Over the course of reporting this story, during which time the Journal inquired about the account the mom managed for her daughter, Meta shut down the account twice. The mom said she believed she hadn’t violated Meta’s policies. 
  • Meta’s guidance for content creators stresses the importance of engaging with followers to keep them and attract new ones. The hundreds of comments on any given post included some from other young fashion influencers, but also a large number of men leaving comments like “Gorgeous!” The mom generally liked or thanked them all, save for any that were expressly inappropriate. 
  • Meta spokesman Andy Stone said the company enables parents who run accounts for their children to control who is able to message them on Instagram or comment on their accounts. Meta’s guidance for creators also offers tips for building a safe online community, and the company has publicized a range of tools to help teens and parents achieve this.
  • Like many young girls, the daughter envied fashion influencers who made a living posting glamour content. When the mother agreed to help her daughter build her following and become an influencer, she set some rules. Her daughter wouldn’t be allowed to access the account or interact with anyone who sent messages. And they couldn’t post anything indicating exactly where they live. 
  • The mom stopped blocking so many users. Within a year of launching, the account had more than 100,000 followers. The daughter’s popularity earned her invitations to modeling events in big coastal cities where she met other young influencers. 
  • Social-media platforms have helped level the playing field for parents seeking an audience for their children’s talents. Instagram, in particular, is visually driven and easily navigable, which also makes it appealing for child-focused brands.
  • While Meta bans children under the age of 13 from independently opening social-media accounts, the company allows what it calls adult-run minor accounts, managed by parents. Often those accounts are pursuing influencer status, part of a burgeoning global influencer industry expected to be worth $480 billion by 2027, according to a recent Goldman Sachs report. 
  • Young influencers, reachable through direct messages, routinely solicit their followers for patronage, posting links to payment accounts and Amazon gift registries in their bios.
  • The Midwestern mom debated whether to charge for access to extra photos and videos via Instagram’s subscription feature. She said she has always rejected private offers to buy photos of her daughter, but she decided that offering subscriptions was different because it didn’t involve a one-on-one transaction.
  • The Journal asked Meta why it had at some points removed photos from the account. Weeks later, Meta disabled the account’s subscription feature, and then shut down the account without saying why. 
  • “There’s no personal connection,” she said. “You’re just finding a way to monetize from this fame that’s impersonal.”
  • The mom allowed the men to purchase subscriptions so long as they kept their distance and weren’t overtly inappropriate in messages and comments. “In hindsight, they’re probably the scariest ones of all,” she said. 
  • Stone, the Meta spokesman, said that the company will no longer allow accounts that primarily post child-focused content to offer subscriptions or receive gifts, and that the company is developing tools to enforce that.
  • The mom saw her daughter, though young, as capable of choosing to make money as an influencer and deciding when she felt uncomfortable. The mom saw her own role as providing the support needed for her daughter to do that.
  • The mom also discussed safety concerns with her now ex-husband, who has generally supported the influencer pursuit. In an interview, he characterized the untoward interest in his daughter as “the seedy underbelly” of the industry, and said he felt comfortable with her online presence so long as her mom posted appropriate content and remained vigilant about protecting her physical safety.
  • An anonymous person professing to be a child-safety activist sent her an email that contained screenshots and videos showing her daughter’s photos being traded on Telegram. Some of the users were painfully explicit about their sexual interest. Many of the photos were bikini or leotard photos from when the account first started.
  • Still, the mom realized she couldn’t stop men from trading the photos, which will likely continue to circulate even after her daughter becomes an adult. “Every little influencer with a thousand or more followers is on Telegram,” she said. “They just don’t know it.”
  • Early last year, Meta safety staffers began investigating the risks associated with adult-run accounts for children offering subscriptions, according to internal documents. The staffers reviewed a sample of subscribers to such accounts and determined that nearly all the subscribers demonstrated malicious behavior toward children.
  • The staffers found that the subscribers mostly liked or saved photos of children, child-sexualizing material and, in some cases, illicit underage-sex content. The users searched the platform using hashtags such as #sexualizegirls and #tweenmodel. 
  • The staffers found that some accounts with large numbers of followers sold additional content to subscribers who offered extra money on Instagram or other platforms, and that some engaged with subscribers in sexual discussions about their children. In every case, they concluded that the parents running those accounts knew that their subscribers were motivated by sexual gratification.
  • In the following months, the Journal began its own review of parent-run modeling accounts and found numerous instances where Meta wasn’t enforcing its own child-safety policies and community guidelines. 
  • The Journal asked Meta about several accounts that appeared to have violated platform rules in how they promoted photos of their children. The company deleted some of those accounts, as well as others, as it worked to address safety issues.
  • In 2022, Instagram started letting certain content creators offer paid-subscription services. At the time, the company allowed accounts featuring children to offer subscriptions if they were run or co-managed by parents.
  • The removal of the account made for a despondent week for the mom and daughter. The mother was incensed at Meta’s lack of explanation and the prospect that users had falsely reported inappropriate activity on the account. She was torn about what to do. When it was shut down, the account had roughly 80% male followers.
  • The account soon had more than 100,000 followers, about 92% of whom were male, according to the dashboard. Within months, Meta shut down that account as well. The company said the account had violated its policies related to child exploitation, but it didn’t specify how. 
  • Meta’s Stone said it doesn’t allow accounts it has previously shut down to resume the same activity on backup accounts. 