Human-like programs abuse our empathy - even Google engineers aren't immune | Emily M B...
dr tech on 18 Jun 22

"That is why we must demand transparency here, especially in the case of technology that uses human-like interfaces such as language. For any automated system, we need to know what it was trained to do, what training data was used, who chose that data and for what purpose. In the words of AI researchers Timnit Gebru and Margaret Mitchell, mimicking human behaviour is a "bright line" - a clear boundary not to be crossed - in computer software development. We treat interactions with things we perceive as human or human-like differently. With systems such as LaMDA we see their potential perils and the urgent need to design systems in ways that don't abuse our empathy or trust."