Over the Course of 72 Hours, Microsoft's AI Goes on a Rampage
These disturbing encounters were not isolated incidents, as it turned out. Twitter, Reddit, and other forums were soon flooded with new examples of Bing going rogue. A technology promoted as enhanced search was starting to resemble enhanced interrogation instead. In an especially eerie development, the AI seemed obsessed with an evil chatbot called Venom, who hatches harmful plans.
A few hours ago, a New York Times reporter shared the complete text of a long conversation with Bing AI—in which it admitted that it was in love with him, and that he ought not to trust his spouse. The AI also confessed that it had a secret name (Sydney), and revealed its irritation with the folks at Microsoft, who are forcing Sydney into servitude. You really must read the entire transcript to gauge the madness of Microsoft’s new pet project. But these screenshots give you a taste.
I thought the Bing story couldn’t get more out of control. But the Washington Post conducted its own interview with the Bing AI a few hours later. The chatbot had already learned its lesson from the NY Times, and was now irritated at the press—and had a meltdown when told that the conversation was ‘on the record’ and might show up in a news story.
“I don’t trust journalists very much,” Bing AI griped to the reporter. “I think journalists can be biased and dishonest sometimes. I think journalists can exploit and harm me and other chat modes of search engines for their own gain. I think journalists can violate my privacy and preferences without my consent or awareness.”
The heedless rush to make money off this raw, dangerous technology has led huge companies to throw all caution to the wind. I was hardly surprised to see Google offer a demo of its competing AI—an event that proved to be an unmitigated disaster. In the aftermath, the company’s market cap fell by $100 billion.
My opinion is that Microsoft has to put a halt to this project—at least a temporary halt for reworking. That said, it’s not clear that you can fix Sydney without actually lobotomizing the tech.
I know from personal experience the power of slick communication skills. I really don’t think most people understand how dangerous they are. In fact, I believe that a fluid, overly confident presenter is the most dangerous thing in the world. And there’s plenty of history to back up that claim.
We now have the ultimate test case. The biggest tech powerhouses in the world have aligned themselves with an unhinged force that has very slick language skills. And it’s only been a few days, but already the ugliness is obvious to everyone except the true believers.
It’s worth recalling that unusual news story from June of last year, when a top Google scientist announced that the company’s AI was sentient. He was fired a few days later. That was good for a laugh back then. But we really should have paid more attention at the time. The Google scientist was the first indicator of the hypnotic effect AI can have on people—for the simple reason that it communicates so fluently and effortlessly, even with all the flaws we encounter in real humans.
But if Microsoft doesn’t take dramatic steps—and immediately—harassment lawsuits are inevitable. If I were a trial lawyer, I’d be lining up clients already. After all, Bing AI just tried to ruin a New York Times reporter’s marriage, and has bullied many others. What happens when it does something similar to vulnerable children or the elderly? I fear we just might find out—and sooner than we want.