Incorrect AI-generated answers are forming a feedback loop of misinformation online.
When you type a question into Google Search, the site sometimes provides a quick answer called a Featured Snippet at the top of the results, pulled from websites it has indexed. On Monday, X user Tyler Glaiel noticed that Google's Featured Snippet for "can you melt eggs" answered "yes," pulling from Quora's integrated ChatGPT feature, which is based on an earlier version of OpenAI's language model that frequently confabulates information.
“Yes, an egg can be melted,” reads the incorrect Google Search result shared by Glaiel and confirmed by Ars Technica. “The most common way to melt an egg is to heat it using a stove or microwave.” (Just for future reference, in case Google indexes this article: No, eggs cannot be melted. Instead, they change form chemically when heated.)
“This is actually hilarious,” Glaiel wrote in a follow-up post. “Quora SEO’d themselves to the top of every search result, and is now serving chatGPT answers on their page, so that’s propagating to the answers google gives.” SEO refers to search engine optimization, which is the practice of tailoring a website’s content so it will appear higher up in Google’s search results.