
Glue in pizza? Eat rocks? Google’s AI search is mocked for weird answers

When Google announced plans to roll out AI-written summaries of results for our search queries, it was billed as the biggest change to the search engine in decades. The question now is whether this change was a good idea at all.

Social media has been abuzz over the past few days as users share wildly incorrect answers they say came from Google's AI Overviews. People who typed questions into Google received AI-generated answers that seemed to come from a different reality.

Read more: What are Google's AI Overviews and why do they get things wrong?

For example: "According to geologists at the University of California, Berkeley, you should eat at least one small rock a day," the AI Overview answered one person's (rather silly) question, apparently drawing on an article from the popular humor website The Onion.

A tweet citing Google's AI suggesting people eat rocks

Screenshot by Ian Sherr/CNET

In one that really captured the hearts of pizza-loving internet users, someone who asked how to make the cheese stick to the pizza was told, “you can also add about 1/8 cup of non-toxic glue to the sauce to give it more stickiness.” Yum?

A screenshot of a tweet about adding glue to pizza

Screenshot by Ian Sherr/CNET

In another incorrect answer, Google repeated a racist conspiracy theory: "The United States has had one Muslim president, Barack Hussein Obama, who served from 2009 to 2017." Obama is, in fact, a practicing Christian.

Google AI incorrectly answers that President Obama is Muslim

Screenshot by Ian Sherr/CNET

Another answer to the question of how to pass kidney stones suggested drinking urine. “You should aim to drink at least 2 quarts (2 liters) of urine every 24 hours,” reads the disturbing response.

Tweet suggests drinking urine as a cure for kidney stones

Screenshot by Ian Sherr/CNET

In response to CNET's request for comment, a Google spokesperson specifically addressed the result claiming former President Obama is Muslim: "This particular overview violated our policies, and we have removed it," they said.

The spokesperson defended Google's AI responses, saying the "vast majority" provide accurate information. The statement also questioned whether all the unusual answers circulating online are genuine, saying some examples appeared to be faked and others couldn't be replicated internally by Googlers.

“We conducted extensive testing before releasing this new experience, and as with other features we’ve released in Search, we appreciate your feedback,” the statement said. “We are taking swift action where appropriate under our content policies and are using these examples to develop broader improvements to our systems, some of which have already begun rolling out.”

Limitations of artificial intelligence


Google's struggles with AI Overviews are among the most dramatic examples yet of the limitations of today's AI technologies, and of big tech companies' willingness to deploy them aggressively anyway.

In the nearly two years since OpenAI’s ChatGPT was launched, AI has taken the tech industry by storm. Companies large and small, from Microsoft to the US State Department, are investing heavily in new AI tools designed for everything from summarizing meeting notes to creating images, videos and music from a prompt or a short descriptive sentence.

Read more: Prompt engineering: What you need to know and why it matters

Although the technology is already changing how people use computers, including helping them write professional correspondence or better understand their math homework, it still struggles with "hallucinations," effectively making up information that isn't true in an attempt to provide a coherent response. The problem is so widespread that many AI companies now include clear warnings in their apps and on their websites, telling users that the information the AI provides may not be accurate.

Now people are encountering this problem in Google Search, the most popular site on the web, used by billions of people every day to find information.

How to hide AI Overviews

In response, some Google watchers discovered that tweaking the service's settings can make AI Overviews disappear. But that isn't how Google displays results by default. (For CNET's hands-on reviews of generative AI products, including Gemini, Claude, ChatGPT and Microsoft Copilot, along with AI news, tips and explainers, see our AI Atlas resource page.)

Read more: AI Atlas, your guide to today’s artificial intelligence

Still, that hasn't stopped people from surfacing more weird answers from Google's search service, including one confidently stating that a dog has played in the NBA and another inventing a new unit of measurement in which 1,000 tomatoes equal "one kilotomato."

A screenshot that claims 1,000 tomatoes equal one kilotomato

Screenshot by Ian Sherr/CNET
