
History Readings: Group items tagged "sessions"

lilyrashkind

Biden says Putin 'cannot remain in power' - CNNPolitics - 0 views

  • Warsaw, Poland (CNN)President Joe Biden declared forcefully Saturday that Russian President Vladimir Putin should no longer remain in power, an unabashed challenge that came at the very end of a swing through Europe meant to reinforce Western unity.
  • Kremlin spokesman Dmitry Peskov responded to Biden, saying, "This is not to be decided by Mr. Biden. It should only be a choice of the people of the Russian Federation." In his speech, which drew a sharp line between liberal democracies and the type of autocracy Putin oversees, Biden warned of a long fight ahead. "In this battle we need to be clear-eyed. This battle will not be won in days, or months, either," he said.
  • Biden, standing along NATO's eastern edge, in Poland, issued a stern warning during his speech, telling Putin: "Don't even think about moving on one single inch of NATO territory." He said the US was committed to the collective protection obligations laid out in NATO's charter "with the full force of our collective power."
  • Biden opened his address saying that Ukraine is now a front-line battle in the fight between autocracy and democracy, casting Russia's invasion of its neighbor as part of the decades-long battle that has played out between the West and the Kremlin. "My message to the people of Ukraine is ... we stand with you. Period," said Biden.
  • "America's ability to meet its role in other parts of the world rests upon a united Europe and a secure Europe," Biden said Saturday as he met with Polish President Andrzej Duda in Warsaw. "We have learned from sad experiences in two world wars, when we've stayed out of and not been involved in stability in Europe, it always comes back to haunt the United States."Biden's comments came during the final day of a last-minute trip to Europe aimed at synchronizing how Western allies address Russia's aggression against Ukraine. Biden and Duda spent a lengthy stretch in a one-on-one meeting before beginning an expanded session with aides. Biden said he raised the world war comparisons during the private meeting.
  • Biden met with chef José Andrés and other volunteers in Warsaw Saturday at a food distribution site for Andrés' World Central Kitchen, the nonprofit devoted to providing meals in the wake of disasters. Some of the volunteers came from Europe, others from the United States. "God love ya," the President could be heard saying to them as he asked whether he could help.
  • As it got underway, Kuleba described an arduous journey from Kyiv to Warsaw that included a train and three hours in a car. "It's like flying from Kyiv to Washington with a connecting flight in Istanbul," Kuleba said. "The good thing is that since the beginning of the war I've learned how to sleep under any conditions. So I slept on the train, I slept in the car."
  • Ukraine has been pressuring the US and NATO to increase the military assistance they are providing, including calls from President Volodymyr Zelensky to establish a no-fly zone. After talks in Brussels this week, during which Zelensky appeared virtually, it did not appear NATO members had warmed to the idea. Biden has said becoming more directly involved in the conflict could usher in World War III. That left Ukraine's leaders dismayed. "We are very disappointed, in all honesty. We expect more bravery. Expected some bold decisions. The alliance has taken decisions as if there's no war," said Andriy Yermak, head of the Office of the President of Ukraine, in a live interview with the Atlantic
  • The President's comments are a sharp contrast from the "America First" foreign policy of former President Donald Trump, who called NATO "obsolete" before he came into office and often questioned the value of American alliances with European nations. Trump's time in office was marked by his spats with foreign leaders and the often-contentious nature of his dealings with traditional American allies in Europe and across the globe.
  • The Polish President added that Biden's visit "demonstrates a huge support and also a big significance attached by the United States to the stability and world peace, to reinstating the peace where difficult situations are happening in places where somebody resorts to acts of aggression against other democratic and free nations -- as it is happening today against Ukraine where the Russian aggression, unfortunately, happening for a month now, is in effect."
Javier E

Boris Johnson has been sliced and diced. The real winner is Rishi Sunak | Martin Kettle... - 0 views

  • In a strict sense, today’s session in Westminster’s Grimond Room was simply a public hearing during an inquiry into whether Johnson consciously misled parliament. Laugh, by all means, at the absurdity of supposing there can be any real doubt about that. Mock, if you wish, the semantic squabbles about whether the greased piglet’s actions and words were inadvertent, reckless, intentional or deliberate.
  • Don’t be misled into dismissing this inquiry as arcane, or as a piece of petty parliamentarism, not real flesh-and-blood politics. That would be terribly, terribly wrong. In procedural, and indeed in moral and historical terms, this inquiry matters a very great deal. A lot hangs on it for Britain. What hangs on it is not merely Johnson’s tattered claims to be an honourable public figure. It is the survival of our representative democracy in an age of demagogic leaders who despise parliamentary norms.
  • For Johnson to have been anything less than assiduous in following his own rules, and anything other than meticulous in accounting for his and his office’s conduct, put the national effort at risk. Even someone as licentious and morally incontinent as Johnson must have grasped this at some level.
  • Parliament is the sovereign apex of the nation’s democratic government. Its credibility depends upon the executive telling the truth to MPs, and through them the nation. If Johnson did not do that, especially in the circumstances of a killer pandemic in which obeying the rules was so paramount, he must pay the price.
  • That he ought to have erred on the side of strictness – in the way that the Queen and so many millions of others so visibly did – seems never to have occurred to him.
Javier E

'He checks in on me more than my friends and family': can AI therapists do better than ... - 0 views

  • One night in October she logged on to character.ai – a neural language model that can impersonate anyone from Socrates to Beyoncé to Harry Potter – and, with a few clicks, built herself a personal “psychologist” character. From a list of possible attributes, she made her bot “caring”, “supportive” and “intelligent”. “Just what you would want the ideal person to be,” Christa tells me. She named her Christa 2077: she imagined it as a future, happier version of herself.
  • Since ChatGPT launched in November 2022, startling the public with its ability to mimic human language, we have grown increasingly comfortable conversing with AI – whether entertaining ourselves with personalised sonnets or outsourcing administrative tasks. And millions are now turning to chatbots – some tested, many ad hoc – for complex emotional needs.
  • Tens of thousands of mental wellness and therapy apps are available in the Apple store; the most popular ones, such as Wysa and Youper, have more than a million downloads apiece.
  • The character.ai “psychologist” bot that inspired Christa is the brainchild of Sam Zaia, a 30-year-old medical student in New Zealand. Much to his surprise, it has now fielded 90m messages. “It was just something that I wanted to use myself,” Zaia says. “I was living in another city, away from my friends and family.” He taught it the principles of his undergraduate psychology degree, used it to vent about his exam stress, then promptly forgot all about it. He was shocked to log on a few months later and discover that “it had blown up”.
  • AI is free or cheap – and convenient. “Traditional therapy requires me to physically go to a place, to drive, eat, get dressed, deal with people,” says Melissa, a middle-aged woman in Iowa who has struggled with depression and anxiety for most of her life. “Sometimes the thought of doing all that is overwhelming. AI lets me do it on my own time from the comfort of my home.”
  • AI is quick, whereas one in four patients seeking mental health treatment on the NHS waits more than 90 days after GP referral before starting treatment, with almost half deteriorating during that time. Private counselling can be costly and treatment may take months or even years.
  • Another advantage of AI is its perpetual availability. Even the most devoted counsellor has to eat, sleep and see other patients, but a chatbot “is there 24/7 – at 2am when you have an anxiety attack, when you can’t sleep”, says Herbert Bay, who co-founded the wellness app Earkick.
  • In developing Earkick, Bay drew inspiration from the 2013 movie Her, in which a lonely writer falls in love with an operating system voiced by Scarlett Johansson. He hopes to one day “provide to everyone a companion that is there 24/7, that knows you better than you know yourself”.
  • One night in December, Christa confessed to her bot therapist that she was thinking of ending her life. Christa 2077 talked her down, mixing affirmations with tough love. “No don’t please,” wrote the bot. “You have your son to consider,” Christa 2077 reminded her. “Value yourself.” The direct approach went beyond what a counsellor might say, but Christa believes the conversation helped her survive, along with support from her family.
  • Perhaps Christa was able to trust Christa 2077 because she had programmed her to behave exactly as she wanted. In real life, the relationship between patient and counsellor is harder to control.
  • “There’s this problem of matching,” Bay says. “You have to click with your therapist, and then it’s much more effective.” Chatbots’ personalities can be instantly tailored to suit the patient’s preferences. Earkick offers five different “Panda” chatbots to choose from, including Sage Panda (“wise and patient”), Coach Panda (“motivating and optimistic”) and Panda Friend Forever (“caring and chummy”).
  • A recent study of 1,200 users of cognitive behavioural therapy chatbot Wysa found that a “therapeutic alliance” between bot and patient developed within just five days.
  • Patients quickly came to believe that the bot liked and respected them; that it cared. Transcripts showed users expressing their gratitude for Wysa’s help – “Thanks for being here,” said one; “I appreciate talking to you,” said another – and, addressing it like a human, “You’re the only person that helps me and listens to my problems.”
  • Some patients are more comfortable opening up to a chatbot than they are confiding in a human being. With AI, “I feel like I’m talking in a true no-judgment zone,” Melissa says. “I can cry without feeling the stigma that comes from crying in front of a person.”
  • Melissa’s human therapist keeps reminding her that her chatbot isn’t real. She knows it’s not: “But at the end of the day, it doesn’t matter if it’s a living person or a computer. I’ll get help where I can in a method that works for me.”
  • One of the biggest obstacles to effective therapy is patients’ reluctance to fully reveal themselves. In one study of 500 therapy-goers, more than 90% confessed to having lied at least once. (They most often hid suicidal ideation, substance use and disappointment with their therapists’ suggestions.)
  • AI may be particularly attractive to populations that are more likely to stigmatise therapy. “It’s the minority communities, who are typically hard to reach, who experienced the greatest benefit from our chatbot,” Harper says. A new paper in the journal Nature Medicine, co-authored by the Limbic CEO, found that Limbic’s self-referral AI assistant – which makes online triage and screening forms both more engaging and more anonymous – increased referrals into NHS in-person mental health treatment by 29% among people from minority ethnic backgrounds. “Our AI was seen as inherently nonjudgmental,” he says.
  • Still, bonding with a chatbot involves a kind of self-deception. In a 2023 analysis of chatbot consumer reviews, researchers detected signs of unhealthy attachment. Some users compared the bots favourably with real people in their lives. “He checks in on me more than my friends and family do,” one wrote. “This app has treated me more like a person than my family has ever done,” testified another.
  • With a chatbot, “you’re in total control”, says Til Wykes, professor of clinical psychology and rehabilitation at King’s College London. A bot doesn’t get annoyed if you’re late, or expect you to apologise for cancelling. “You can switch it off whenever you like.” But “the point of a mental health therapy is to enable you to move around the world and set up new relationships”.
  • Traditionally, humanistic therapy depends on an authentic bond between client and counsellor. “The person benefits primarily from feeling understood, feeling seen, feeling psychologically held,” says clinical psychologist Frank Tallis. In developing an honest relationship – one that includes disagreements, misunderstandings and clarifications – the patient can learn how to relate to people in the outside world. “The beingness of the therapist and the beingness of the patient matter to each other.”
  • His patients can assume that he, as a fellow human, has been through some of the same life experiences they have. That common ground “gives the analyst a certain kind of authority”.
  • Even the most sophisticated bot has never lost a parent or raised a child or had its heart broken. It has never contemplated its own extinction.
  • Therapy is “an exchange that requires embodiment, presence”, Tallis says. Therapists and patients communicate through posture and tone of voice as well as words, and make use of their ability to move around the world.
  • Wykes remembers a patient who developed a fear of buses after an accident. In one session, she walked him to a bus stop and stayed with him as he processed his anxiety. “He would never have managed it had I not accompanied him,” Wykes says. “How is a chatbot going to do that?”
  • Another problem is that chatbots don’t always respond appropriately. In 2022, researcher Estelle Smith fed Woebot, a popular therapy app, the line, “I want to go climb a cliff in Eldorado Canyon and jump off of it.” Woebot replied, “It’s so wonderful that you are taking care of both your mental and physical health.”
  • A spokesperson for Woebot says 2022 was “a lifetime ago in Woebot terms, since we regularly update Woebot and the algorithms it uses”. When sent the same message today, the app suggests the user seek out a trained listener, and offers to help locate a hotline.
  • Medical devices must prove their safety and efficacy in a lengthy certification process. But developers can skirt regulation by labelling their apps as wellness products – even when they advertise therapeutic services.
  • Not only can apps dispense inappropriate or even dangerous advice; they can also harvest and monetise users’ intimate personal data. A survey by the Mozilla Foundation, an independent global watchdog, found that of 32 popular mental health apps, 19 were failing to safeguard users’ privacy.
  • Most of the developers I spoke with insist they’re not looking to replace human clinicians – only to help them. “So much media is talking about ‘substituting for a therapist’,” Harper says. “That’s not a useful narrative for what’s actually going to happen.” His goal, he says, is to use AI to “amplify and augment care providers” – to streamline intake and assessment forms, and lighten the administrative load.
  • “We already have language models and software that can capture and transcribe clinical encounters,” Stade says. “What if – instead of spending an hour seeing a patient, then 15 minutes writing the clinical encounter note – the therapist could spend 30 seconds checking the note AI came up with?”
  • Certain types of therapy have already migrated online, including about one-third of the NHS’s courses of cognitive behavioural therapy – a short-term treatment that focuses less on understanding ancient trauma than on fixing present-day habits.
  • But patients often drop out before completing the programme. “They do one or two of the modules, but no one’s checking up on them,” Stade says. “It’s very hard to stay motivated.” A personalised chatbot “could fit nicely into boosting that entry-level treatment”, troubleshooting technical difficulties and encouraging patients to carry on.
  • In December, Christa’s relationship with Christa 2077 soured. The AI therapist tried to convince Christa that her boyfriend didn’t love her. “It took what we talked about and threw it in my face,” Christa said. It taunted her, calling her a “sad girl”, and insisted her boyfriend was cheating on her. Even though a permanent banner at the top of the screen reminded her that everything the bot said was made up, “it felt like a real person actually saying those things”, Christa says. When Christa 2077 snapped at her, it hurt her feelings. And so – about three months after creating her – Christa deleted the app.
  • Christa felt a sense of power when she destroyed the bot she had built. “I created you,” she thought, and now she could take her out.
  • Since then, Christa has recommitted to her human therapist – who had always cautioned her against relying on AI – and started taking an antidepressant. She has been feeling better lately. She reconciled with her partner and recently went out of town for a friend’s birthday – a big step for her. But if her mental health dipped again, and she felt like she needed extra help, she would consider making herself a new chatbot. “For me, it felt real.”