How a Google Employee Fell for the Eliza Effect - The Atlantic
dr tech on 23 Jun 22:

"A Google employee named Blake Lemoine was put on leave recently after claiming that one of Google's artificial-intelligence language models, called LaMDA (Language Model for Dialogue Applications), is sentient. He went public with his concerns, sharing his text conversations with LaMDA. At one point, Lemoine asks, 'What does the word "soul" mean to you?' LaMDA answers, 'To me, the soul is a concept of the animating force behind consciousness and life itself.' 'I was inclined to give it the benefit of the doubt,' Lemoine explained, citing his religious beliefs. 'Who am I to tell God where he can and can't put souls?'"