"A cautionary tale: Back in 2009, government contractor Jeffrey Kantor was browsing online, seeking to make a radio-controlled airplane for his son. He began to type his search into Google: "How do I build a radio-controlled"-[enter autocomplete]-"bomb." That's right, before Kantor knew it, he had accidentally asked Google how to make an explosive device. And his life would never be the same."
"Researchers fed these algorithms (which function like autocomplete, but for images) pictures of a man cropped below his neck: 43% of the time the image was autocompleted with the man wearing a suit. When you fed the same algorithm a similarly cropped photo of a woman, it autocompleted her wearing a low-cut top or bikini a massive 53% of the time. For some reason, the researchers gave the algorithm a picture of the Democratic congresswoman Alexandria Ocasio-Cortez and found that it also automatically generated an image of her in a bikini. (After ethical concerns were raised on Twitter, the researchers had the computer-generated image of AOC in a swimsuit removed from the research paper.)"
"If AI causes mass unemployment among the general populace, it will be the first time in history that any technology has ever done that. Industrial machinery, computer-controlled machine tools, software applications, and industrial robots all caused panics about human obsolescence, and nothing of the kind ever came to pass; pretty much everyone who wants a job still has a job. As Noah has written, a wave of recent evidence shows that adoption of industrial robots and automation technology in general is associated with an increase in employment at the company and industry level."
"My entry point into this story began, as so many things do, with a late-night Google. Last December, I took an unsettling tumble into a wormhole of Google autocomplete suggestions that ended with "did the holocaust happen", and an entire page of results claiming it didn't."
"As Tremblay notes, a major problem for AI chatbots like ChatGPT and Bard is their tendency to confidently state incorrect information as fact. The systems frequently "hallucinate" (that is, make up information) because they are essentially autocomplete systems."
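The "essentially autocomplete" point can be made concrete with a toy sketch. The following is a hypothetical, drastically simplified illustration (a bigram word model, nothing like the transformer architectures behind ChatGPT or Bard): the model always emits the statistically most likely next word from its training text, with no representation of truth, so a fluent but false continuation falls out naturally.

```python
from collections import Counter, defaultdict

# Toy "autocomplete" model: counts which word follows which in a tiny corpus,
# then greedily emits the most frequent next word. Purely illustrative.
corpus = (
    "the capital of france is paris . "
    "the capital of spain is madrid . "
    "the capital of france is beautiful ."
).split()

# word -> Counter of the words seen immediately after it
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def autocomplete(prompt, max_words=6):
    """Greedily extend the prompt with the most likely next word."""
    words = prompt.split()
    for _ in range(max_words):
        nxt = bigrams.get(words[-1])
        if not nxt:
            break
        word, _count = nxt.most_common(1)[0]
        words.append(word)
        if word == ".":  # stop at end of "sentence"
            break
    return " ".join(words)

# "paris" is the most reinforced word after "is", so the model confidently
# completes a false statement about Spain's capital:
print(autocomplete("the capital of spain is"))
```

The failure mode is the point: the model has seen "is paris" more often than "is madrid", so it "hallucinates" Paris as Spain's capital. Real chatbots are vastly more sophisticated, but the underlying objective of predicting plausible continuations, rather than true ones, is the same.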