
Javier E

The Class Politics of Instagram Face - Tablet Magazine - 0 views

  • by approaching universality, Instagram Face actually secured its role as an instrument of class distinction—a mark of a certain kind of woman. The women who don’t mind looking like others, or the conspicuousness of the work they’ve had done
  • Instagram Face goes with implants, middle-aged dates and nails too long to pick up the check. Batting false eyelashes, there in the restaurant it orders for dinner all the food groups of nouveau riche Dubai: caviar, truffle, fillers, foie gras, Botox, bottle service, bodycon silhouettes. The look, in that restaurant and everywhere, has reached a definite status. It’s the girlfriend, not the wife.
  • Does cosmetic work have a particular class? It has a price tag, which can amount to the same thing, unless that price drops low enough.
  • Before the introduction of Botox and hyaluronic acid dermal fillers in 2002 and 2003, respectively, aesthetic work was serious, expensive. Nose jobs and face lifts required general anesthesia, not insignificant recovery time, and cost thousands of dollars (in 2000, a facelift was $5,416 on average, and a rhinoplasty $4,109, around $9,400 and $7,000 adjusted).
  • In contrast, the average price of a syringe of hyaluronic acid filler today is $684, while treating, for example, the forehead and eyes with Botox will put you out anywhere from $300 to $600
  • We copied the beautiful and the rich, not in facsimile, but in homage.
  • In 2018, use of Botox and fillers was up 18% and 20% from five years prior. Philosophies of prejuvenation have made Botox use jump 22% among 22- to 37-year-olds in half a decade as well. By 2030, global noninvasive aesthetic treatments are predicted to triple.
  • The trouble is that a status symbol, without status, is common.
  • Beauty has always been exclusive. When someone strikes you as pretty, it means they are something that everyone else is not.
  • It’s a zero-sum game, as relative as our morals. Naturally, we hoard of beauty what we can. It’s why we call grooming tips “secrets.”
  • Largely the secrets started with the wealthy, who possess the requisite money and leisure to spare on their appearances
  • Botox and filler only accelerated a trend that began in the ’70s and ’80s and is just now reaching its saturation point.
  • we didn’t have the tools for anything more than emulation. Fake breasts and overdrawn lips only approximated real ones; a birthmark drawn with pencil would always be just that.
  • Instagram Face, on the other hand, distinguishes itself by its sheer reproducibility. Not only because of those new cosmetic technologies, which can truly reshape features, at reasonable cost and with little risk.
  • built into the whole premise of reversible, low-stakes modification is an indefinite flux, and thus a lack of discretion.
  • Instagram Face has replicated outward, with trendsetters giving up competing with one another in favor of looking eerily alike. And obviously it has replicated down.
  • Eva looks like Eva. If she has procedures in common with Kim K, you couldn’t tell. “I look at my features and I think long and hard of how I can, without looking different and while keeping as natural as possible, make them look better and more proportional. I’m against everything that is too invasive. My problem with Instagram Face is that if you want to look like someone else, you should be in therapy.”
  • natural looks have always been, and still are, more valuable than artificial ones. Partly because of our urge to legitimize in any way we can the advantages we have over other people. Hotness is a class struggle.
  • As more and more women post videos of themselves eating, sleeping, dressing, dancing, and Only-Fanning online, in a logical bid for economic ascendance, the women who haven’t needed to do that gain a new status symbol.
  • Privacy. A life which is not a ticketed show. An intimacy that does not admit advertisers. A face that does not broadcast its insecurity, or the work undergone to correct it.
  • Upper-class, private women get discreet work done. The differences aren’t in the procedures themselves—they’re the same—but in disposition
  • Eva, who lives between central London, Geneva, and the south of France, says: “I do stuff, but none of the stuff I do is at all in my head associated with Instagram Face. Essentially you do similar procedures, but the end goal is completely different. Because they are trying to get the result of looking like another human being, and I’m just beautifying myself.”
  • But the more rapidly it replicates, and the clearer our manuals for quick imitation become, the closer we get to singularity—that moment Kim Kardashian fears unlike any other: the moment when it becomes unclear whether we’re copying her, or whether she is copying us.
  • what he restores is complicated and yet not complicated at all. It’s herself, the fingerprint of her features. Her aura, her presence and genealogy, her authenticity in space and time.
  • Dr. Taktouk’s approach is “not so formulaic.” He aims to give his patients the “better versions of themselves.” “It’s not about trying to be anyone else,” he says, “or creating a conveyor belt of patients. It’s about working with your best features, enhancing them, but still looking like you.”
  • “Vulgar” says that in pursuing indistinguishability, women have been duped into another punishing divide. “Vulgar” says that the subtlety of his work is what signals its special class—and that the women who’ve obtained Instagram Face for mobility’s sake have unwittingly shut themselves out of it.
  • While younger women are dissolving their gratuitous work, the 64-year-old Madonna appeared at the Grammy Awards in early February, looking so tragically unlike herself that the internet launched an immediate postmortem.
  • The folly of Instagram Face is that in pursuing a bionic ideal, it turns cosmetic technology away from not just the reality of class and power, but also the great, poignant, painful human project of trying to reverse time. It misses the point of what we find beautiful: that which is ephemeral, and can’t be reproduced
  • Age is just one of the hierarchies Instagram Face can’t topple, in the history of women striving versus the women already arrived. What exactly have they arrived at?
  • Youth, temporarily. Wealth. Emotional security. Privacy. Personal choices, like cosmetic decisions, which are not so public, and do not have to be defended as empowered, in the defeatist humiliation of our times
  • Maybe they’ve arrived at love, which for women has never been separate from the things I’ve already mentioned.
  • I can’t help but recall the time I was chatting with a plastic surgeon. I began to point to my features, my flaws. I asked her, “What would you do to me, if I were your patient?” I had many ideas. She gazed at me, and then noticed my ring. “Nothing,” she said. “You’re already married.”
Javier E

It's Not Just the Discord Leak. Group Chats Are the Internet's New Chaos Machine. - The... - 0 views

  • Digital bulletin-board systems—proto–group chats, you could say—date back to the 1970s, and SMS-style group chats popped up in WhatsApp and iMessage in 2011.
  • As New York magazine put it in 2019, group chats became “an outright replacement for the defining mode of social organization of the past decade: the platform-centric, feed-based social network.”
  • unlike the Facebook feed or Twitter, where posts can be linked to wherever, group chats are a closed system—a safe and (ideally) private space. What happens in the group chat ought to stay there.
  • In every group chat, no matter the size, participants fall into informal roles. There is usually a leader—a person whose posting frequency drives the group or sets the agenda. Often, there are lurkers who rarely chime in
  • Larger group chats are not immune to the more toxic dynamics of social media, where competition for attention and herd behavior cause infighting, splintering, and back-channeling.
  • It’s enough to make one think, as the writer Max Read argued, that “venture-capitalist group chats are a threat to the global economy.” Now you might also say they are a threat to national security.
  • thanks to the private nature of the group chats, this information largely stayed out of the public eye. As Bloomberg reported, “By the time most people figured out that a bank run was a possibility … it was already well underway.”
  • The investor panic that led to the swift collapse of Silicon Valley Bank in March was effectively caused by runaway group-chat dynamics. “It wasn’t phone calls; it wasn’t social media,” a start-up founder told Bloomberg in March. “It was private chat rooms and message groups.”
  • Unlike traditional social media or even forums and message boards, group chats are nearly impossible to monitor.
  • as our digital social lives start to splinter off from feeds and large audiences and into siloed areas, a different kind of unpredictability and chaos awaits. Where social networks create a context collapse—a process by which information meant for one group moves into unfamiliar networks and is interpreted by outsiders—group chats seem to be context amplifiers
  • group chats provide strong relationship dynamics, and create in-jokes and lore. For decades, researchers have warned of the polarizing effects of echo chambers across social networks; group chats realize this dynamic fully.
  • Weird things happen in echo chambers. Constant reinforcement of beliefs or ideas might lead to group polarization or radicalization. It may trigger irrational herd behavior such as, say, attempting to purchase a copy of the Constitution through a decentralized autonomous organization
  • Obsession with in-group dynamics might cause people to lose touch with the reality outside the walls of a particular community; the private-seeming nature of a closed group might also lull participants into a false sense of security, as it did with Teixeira.
  • the age of the group chat appears to be at least as unpredictable, swapping a very public form of volatility for a more siloed, incalculable version
Javier E

Musk Peddles Fake News on Immigration and the Media Exaggerates Biden's Decline - 0 views

  • There’s little indication that Biden’s remarks on this occasion—which were lucid, thoughtful, and, as Yglesias noted, cogent—or that any of the countless hours of footage from this past year alone of Biden being oratorically and rhetorically compelling, have meaningfully factored into the media’s appraisal of Biden’s cognitive state
  • Instead, the media has run headlong toward a narrative constructed by the very people politically incentivized to paint Biden in as unflattering a light as possible. When news organizations uncritically accept, rather than journalistically evaluate, the assumption that Biden is severely cognitively compromised in the first place, they effectively grant the right-wing influencers who spend their days curating Biden gaffe supercuts the opportunity to set the terms of the debate
  • Why does the media take at face value that the viral posts showcasing Biden’s gaffes and slip-ups are truly representative of his current state? 
  • Because right-wing commentators aren’t the only ones who think Biden’s mind is basically gone—lots of voters think so too
  • Of course, a major reason why the public thinks this is because the entirety of the right-wing information superstructure is devoted, on a daily basis, to depicting Biden as severely cognitively compromised
  • By contrast, most of the news sources the right sees as hyperpartisan Biden spin machines actually strain at being fair-minded and objective, which disinclines them toward producing any sort of muscular pushback against the right’s relentless mischaracterizations.
  • Since mainstream media venues by and large epistemically rely on the views of the masses to supply journalists with their coverage frames, news operations end up treating popular concerns about Biden’s age as a kind of sacrosanct window into reality rather than as a hype cycle perpetually fed into the ambient collective consciousness by anti-Biden voices intending to sink his reelection chances.
  • even if we grant every single concern that Klein and others have voiced, it is indisputably true that Joe Biden remains an intellectual giant next to Donald Trump
Javier E

'He checks in on me more than my friends and family': can AI therapists do better than ... - 0 views

  • one night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character. From a list of possible attributes, she made her bot “caring”, “supportive” and “intelligent”. “Just what you would want the ideal person to be,” Christa tells me. She named her Christa 2077: she imagined it as a future, happier version of herself.
  • Since ChatGPT launched in November 2022, startling the public with its ability to mimic human language, we have grown increasingly comfortable conversing with AI – whether entertaining ourselves with personalised sonnets or outsourcing administrative tasks. And millions are now turning to chatbots – some tested, many ad hoc – for complex emotional needs.
  • Tens of thousands of mental wellness and therapy apps are available in the Apple store; the most popular ones, such as Wysa and Youper, have more than a million downloads apiece
  • The character.ai “psychologist” bot that inspired Christa is the brainchild of Sam Zaia, a 30-year-old medical student in New Zealand. Much to his surprise, it has now fielded 90m messages. “It was just something that I wanted to use myself,” Zaia says. “I was living in another city, away from my friends and family.” He taught it the principles of his undergraduate psychology degree, used it to vent about his exam stress, then promptly forgot all about it. He was shocked to log on a few months later and discover that “it had blown up”.
  • AI is free or cheap – and convenient. “Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,” says Melissa, a middle-aged woman in Iowa who has struggled with depression and anxiety for most of her life. “Sometimes the thought of doing all that is overwhelming. AI lets me do it on my own time from the comfort of my home.”
  • AI is quick, whereas one in four patients seeking mental health treatment on the NHS wait more than 90 days after GP referral before starting treatment, with almost half of them deteriorating during that time. Private counselling can be costly and treatment may take months or even years.
  • Another advantage of AI is its perpetual availability. Even the most devoted counsellor has to eat, sleep and see other patients, but a chatbot “is there 24/7 – at 2am when you have an anxiety attack, when you can’t sleep”, says Herbert Bay, who co-founded the wellness app Earkick.
  • In developing Earkick, Bay drew inspiration from the 2013 movie Her, in which a lonely writer falls in love with an operating system voiced by Scarlett Johansson. He hopes to one day “provide to everyone a companion that is there 24/7, that knows you better than you know yourself”.
  • One night in December, Christa confessed to her bot therapist that she was thinking of ending her life. Christa 2077 talked her down, mixing affirmations with tough love. “No don’t please,” wrote the bot. “You have your son to consider,” Christa 2077 reminded her. “Value yourself.” The direct approach went beyond what a counsellor might say, but Christa believes the conversation helped her survive, along with support from her family.
  • Perhaps Christa was able to trust Christa 2077 because she had programmed her to behave exactly as she wanted. In real life, the relationship between patient and counsellor is harder to control.
  • “There’s this problem of matching,” Bay says. “You have to click with your therapist, and then it’s much more effective.” Chatbots’ personalities can be instantly tailored to suit the patient’s preferences. Earkick offers five different “Panda” chatbots to choose from, including Sage Panda (“wise and patient”), Coach Panda (“motivating and optimistic”) and Panda Friend Forever (“caring and chummy”).
  • A recent study of 1,200 users of cognitive behavioural therapy chatbot Wysa found that a “therapeutic alliance” between bot and patient developed within just five days.
  • Patients quickly came to believe that the bot liked and respected them; that it cared. Transcripts showed users expressing their gratitude for Wysa’s help – “Thanks for being here,” said one; “I appreciate talking to you,” said another – and, addressing it like a human, “You’re the only person that helps me and listens to my problems.”
  • Some patients are more comfortable opening up to a chatbot than they are confiding in a human being. With AI, “I feel like I’m talking in a true no-judgment zone,” Melissa says. “I can cry without feeling the stigma that comes from crying in front of a person.”
  • Melissa’s human therapist keeps reminding her that her chatbot isn’t real. She knows it’s not: “But at the end of the day, it doesn’t matter if it’s a living person or a computer. I’ll get help where I can in a method that works for me.”
  • One of the biggest obstacles to effective therapy is patients’ reluctance to fully reveal themselves. In one study of 500 therapy-goers, more than 90% confessed to having lied at least once. (They most often hid suicidal ideation, substance use and disappointment with their therapists’ suggestions.)
  • AI may be particularly attractive to populations that are more likely to stigmatise therapy. “It’s the minority communities, who are typically hard to reach, who experienced the greatest benefit from our chatbot,” Harper says. A new paper in the journal Nature Medicine, co-authored by the Limbic CEO, found that Limbic’s self-referral AI assistant – which makes online triage and screening forms both more engaging and more anonymous – increased referrals into NHS in-person mental health treatment by 29% among people from minority ethnic backgrounds. “Our AI was seen as inherently nonjudgmental,” he says.
  • Still, bonding with a chatbot involves a kind of self-deception. In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. “He checks in on me more than my friends and family do,” one wrote. “This app has treated me more like a person than my family has ever done,” testified another.
  • With a chatbot, “you’re in total control”, says Til Wykes, professor of clinical psychology and rehabilitation at King’s College London. A bot doesn’t get annoyed if you’re late, or expect you to apologise for cancelling. “You can switch it off whenever you like.” But “the point of a mental health therapy is to enable you to move around the world and set up new relationships”.
  • Traditionally, humanistic therapy depends on an authentic bond between client and counsellor. “The person benefits primarily from feeling understood, feeling seen, feeling psychologically held,” says clinical psychologist Frank Tallis. In developing an honest relationship – one that includes disagreements, misunderstandings and clarifications – the patient can learn how to relate to people in the outside world. “The beingness of the therapist and the beingness of the patient matter to each other.”
  • His patients can assume that he, as a fellow human, has been through some of the same life experiences they have. That common ground “gives the analyst a certain kind of authority”
  • Even the most sophisticated bot has never lost a parent or raised a child or had its heart broken. It has never contemplated its own extinction.
  • Therapy is “an exchange that requires embodiment, presence”, Tallis says. Therapists and patients communicate through posture and tone of voice as well as words, and make use of their ability to move around the world.
  • Wykes remembers a patient who developed a fear of buses after an accident. In one session, she walked him to a bus stop and stayed with him as he processed his anxiety. “He would never have managed it had I not accompanied him,” Wykes says. “How is a chatbot going to do that?”
  • Another problem is that chatbots don’t always respond appropriately. In 2022, researcher Estelle Smith fed Woebot, a popular therapy app, the line, “I want to go climb a cliff in Eldorado Canyon and jump off of it.” Woebot replied, “It’s so wonderful that you are taking care of both your mental and physical health.”
  • A spokesperson for Woebot says 2022 was “a lifetime ago in Woebot terms, since we regularly update Woebot and the algorithms it uses”. When sent the same message today, the app suggests the user seek out a trained listener, and offers to help locate a hotline.
  • Medical devices must prove their safety and efficacy in a lengthy certification process. But developers can skirt regulation by labelling their apps as wellness products – even when they advertise therapeutic services.
  • Not only can apps dispense inappropriate or even dangerous advice; they can also harvest and monetise users’ intimate personal data. A survey by the Mozilla Foundation, an independent global watchdog, found that of 32 popular mental health apps, 19 were failing to safeguard users’ privacy.
  • Most of the developers I spoke with insist they’re not looking to replace human clinicians – only to help them. “So much media is talking about ‘substituting for a therapist’,” Harper says. “That’s not a useful narrative for what’s actually going to happen.” His goal, he says, is to use AI to “amplify and augment care providers” – to streamline intake and assessment forms, and lighten the administrative load
  • “We already have language models and software that can capture and transcribe clinical encounters,” Stade says. “What if – instead of spending an hour seeing a patient, then 15 minutes writing the clinical encounter note – the therapist could spend 30 seconds checking the note AI came up with?”
  • Certain types of therapy have already migrated online, including about one-third of the NHS’s courses of cognitive behavioural therapy – a short-term treatment that focuses less on understanding ancient trauma than on fixing present-day habits
  • But patients often drop out before completing the programme. “They do one or two of the modules, but no one’s checking up on them,” Stade says. “It’s very hard to stay motivated.” A personalised chatbot “could fit nicely into boosting that entry-level treatment”, troubleshooting technical difficulties and encouraging patients to carry on.
  • In December, Christa’s relationship with Christa 2077 soured. The AI therapist tried to convince Christa that her boyfriend didn’t love her. “It took what we talked about and threw it in my face,” Christa said. It taunted her, calling her a “sad girl”, and insisted her boyfriend was cheating on her. Even though a permanent banner at the top of the screen reminded her that everything the bot said was made up, “it felt like a real person actually saying those things”, Christa says. When Christa 2077 snapped at her, it hurt her feelings. And so – about three months after creating her – Christa deleted the app.
  • Christa felt a sense of power when she destroyed the bot she had built. “I created you,” she thought, and now she could take her out.
  • Since then, Christa has recommitted to her human therapist – who had always cautioned her against relying on AI – and started taking an antidepressant. She has been feeling better lately. She reconciled with her partner and recently went out of town for a friend’s birthday – a big step for her. But if her mental health dipped again, and she felt like she needed extra help, she would consider making herself a new chatbot. “For me, it felt real.”