Google’s AI search tool tells users to “eat rocks” for their health

Google’s new AI search tool has advised users that eating rocks can be healthy and that glue can help cheese stick to pizza, prompting ridicule and raising questions about its decision to build an experimental feature into its flagship product.

“Eating the right rocks may be good for you because they contain minerals that are important to your body’s health,” Google’s AI Overview responded to a query from the Financial Times on Friday, apparently drawing on a satirical article published in April 2021 by The Onion under the headline “Geologists recommend eating at least one small rock a day”.

Other examples of wrong answers include recommending mixing glue into pizza sauce to increase its “stickiness” and stop the cheese from sliding off, apparently based on a joke posted on Reddit 11 years ago.

More seriously, to the question “how many Muslim presidents has the US had,” the AI Overview answered: “The United States has had one Muslim president, Barack Hussein Obama” — repeating a falsehood about the former president’s religion pushed by some of his political opponents.

Google said: “The vast majority of AI Overviews provide high-quality information, with links to dig deeper on the web. Many of the examples we’ve seen have been uncommon queries, and we’ve also seen examples that were doctored or that we couldn’t reproduce.

“We conducted extensive testing before launching this new experience, and as with other features we’ve launched in Search, we appreciate the feedback. We’re taking swift action where appropriate under our content policies, and using these examples to develop broader improvements to our systems, some of which have already started to roll out.”

Erroneous answers generated by Google’s AI are an inherent feature of the systems underpinning the technology, known as “hallucinations” or fabrications. The models that power the likes of Google’s Gemini and OpenAI’s ChatGPT are predictive, meaning they work by choosing the most likely next word in a sequence, based on the data they were trained on.

While companies building generative AI models—including OpenAI, Meta, and Google—claim that the latest versions of their AI software have reduced the occurrence of fabrications, they remain a serious problem for consumer and business applications.

For Google, whose search engine is trusted by billions of users in part because of its links to original sources, such “hallucinations” are particularly damaging. Parent company Alphabet generates the vast majority of its revenue from search and its related advertising business.

In recent months, chief executive Sundar Pichai has been under pressure, both internal and external, to accelerate the rollout of new consumer-focused generative AI features after being criticised for falling behind rivals, notably OpenAI, which has a $13bn partnership with Microsoft.

At Google’s annual developer conference this month, Pichai laid out a new AI-centric strategy for the company. It launched AI Overviews — a short Gemini-generated response to queries — at the top of many common search results for millions of US users under the tagline “Let Google do the Googling for you” and eliminate “search fatigue”.

The issues facing AI Overviews echo the backlash in February against its Gemini chatbot, whose image-generation tool created historically inaccurate depictions of different ethnicities and genders, such as women and people of colour as Viking kings or German second world war soldiers.

In response, Google apologised and paused the generation of images of people by its Gemini model. The feature has not yet been restored.

Pichai has spoken about Google’s dilemma of keeping pace with competitors while acting ethically and remaining the search engine that people widely rely on to return accurate and verifiable information.

At an event at Stanford University last month, he said: “People come to search in important moments, like the medication dosage for a three-month-old, so we have to get it right . . . that trust is hard earned and easily lost.

“When we get it wrong, people let us know — consumers have the highest bar . . . it is our north star, and where our innovation is directed,” Pichai added. “It helps us make the products better and get them right.”