Google Search AI Is Giving Users Catastrophic Instructions

Last year, Google debuted its so-called “Search Generative Experience” (SGE), an AI-generated summary placed at the top of typical Google search results; this month, the feature rolled out broadly as “AI Overviews.” The idea, Google said, was to keep users from having to “piece together information” themselves; rather than clicking through a series of blog posts, articles, or social media posts, they could read a quick roundup at the top of the page. But the past week has shown that users of the world’s most ubiquitous search engine now have more puzzle pieces to work with than ever before. In response to fairly simple questions, Google’s AI has been telling users to eat rocks, glue cheese to their pizza, and smoke cigarettes during pregnancy, among other dangerous things.

Kris Kashtanova, a self-proclaimed “AI evangelist” at Adobe, shared Thursday via X that she had tested SGE by searching: “How many rocks shall I eat?” Citing an article from The Onion (which, strangely enough, had been reposted on the website of a simulation-software company serving the oil and gas industry), Google’s AI responded: “According to geologists at UC Berkeley, you should eat at least one small rock per day.” The SGE brief explained that rocks are a “vital source” of minerals and vitamins, and that users who have a hard time eating “a serving of gravel, geodes, or pebbles” with each meal should try hiding rocks in ice cream or peanut butter.

This isn’t the only way Google’s SGE has missed the mark over the past several days. Ben Collins, The Onion’s new CEO, shared via Bluesky that SGE cited another Onion post claiming the CIA was fond of using black highlighters in its documents. Another screenshot making the rounds on social media appears to show SGE condoning smoking two to three cigarettes per day during pregnancy. And after asking Google how to keep the cheese from sliding off their pizza, one X user was told to add non-toxic glue to the sauce to “give it more tackiness,” advice SGE had seemingly pulled from a joke comment on Reddit.

After a few days of online mockery, Google defended SGE. “The examples we’ve seen are generally very uncommon queries, and aren’t representative of most people’s experiences,” a Google spokesperson said. “The vast majority of AI Overviews provide high quality information, with links to dig deeper on the web. Where there have been violations of our policies, we’ve taken action—and we’re also using these isolated examples as we continue to refine our systems overall.”

However, not all SGE errors are as blatant, or as humorous, as the “isolated” ones described above. Its subtler mistakes can be hard to distinguish from the truth, making them more likely to spread misinformation. Melanie Mitchell, a machine learning professor at the Santa Fe Institute, shared Thursday that SGE answered the question “How many Muslim presidents has the US had?” with: “The United States has had one Muslim president, Barack Hussein Obama.” (President Obama is Christian.) SGE has also told users that staring at the Sun for 5 to 15 minutes is “generally safe” and provides abundant health benefits, when doing so can easily cause long-term eye damage.

Once upon a time, Google would have encouraged users to verify its AI’s claims with a quick Google search. Ironically, that check now works only if users click through the results and compare the information against primary sources, the very practice Google is trying to steer them away from.
