Google's Relationship With Facts Is Getting Wobblier - The Atlantic
-
Misinformation or even disinformation in search results was already a problem before generative AI. Back in 2017, The Outline noted that a snippet once confidently asserted that Barack Obama was the king of America.
-
This is what experts have worried about since ChatGPT first launched: false information confidently presented as fact, without any indication that it could be totally wrong. The problem is “the way things are presented to the user, which is, ‘Here’s the answer,’” Chirag Shah, a professor of information and computer science at the University of Washington, told me. “You don’t need to follow the sources. We’re just going to give you the snippet that would answer your question. But what if that snippet is taken out of context?”
-
Responding to the notion that Google is incentivized to prevent users from navigating away, he added that “we have no desire to keep people on Google.”