Why your Google search is about to look very different (and possibly worse)

Google, possibly feeling that it is lagging behind in the artificial intelligence (AI) arena, has made a number of updates over the past year in an attempt to compete with smaller firms like OpenAI.

The latest move is the introduction of “AI Overviews” to search results. Instead of the usual list of websites, users – starting in the US – may see an AI-generated summary of the topic at the top of their search results.

The move is not without controversy. Not because of fears of rogue AIs or other speculation about artificial general intelligence (AGI), but for reasons similar to those that led creatives to protest large language models and image-generating AIs trained on human work.

Previously, when you Googled something (whether you should shower in the morning or at night, for example), you would be met with a list of algorithmically selected web pages that you could then click on to learn more. Those pages are (usually) written by people, and clicking through and reading them allows those people to earn money for their work (e.g. through ad revenue or subscription models) and to be recognized for it. Now, you may instead be met with an AI-generated summary (with all the hallucinations and errors these models still produce) built by training on other people’s work.

“AI Overviews show links to resources that support the information in the snapshot and explore the topic further. This allows people to dig deeper and discover a diverse range of content from publishers, creators, retailers, businesses, and more, and use the information they find to advance their tasks,” Google explains of the update. “Google’s systems automatically determine which links appear.”

The AI overview will provide links, but there are concerns that people will simply read the summary and leave, cutting off a vital revenue stream for the people and companies who write about these topics. And while we’re sure tech bros will celebrate this, AI models rely on human-generated text for training. Feed them too much AI-generated content to learn from and the whole model can collapse.

“Our primary conclusion across all scenarios is that without enough fresh real data in each generation of an autophagous (self-consuming) loop, future generative models are doomed to have their quality (precision) or diversity (recall) progressively decrease,” one team investigating such a scenario wrote in a yet-to-be-peer-reviewed paper posted to the arXiv preprint server. “We term this condition Model Autophagy Disorder (MAD), making analogy to mad cow disease.”

From the user’s point of view, all you care about is getting useful information. But the AI models aren’t quite up to the task yet, still hallucinating regularly. Users report finding such errors in Google’s AI Overviews, including the AI attributing an iconic quote from the literary classic Of Mice and Men to an episode of the zombie TV show The Walking Dead, claiming that fish lay their eggs at temperatures that would cook them, and suggesting that fish suckle on teats dropped into their tank as a nursing practice.

Unfortunately for any Google users who aren’t fans, there is currently no way to turn the AI feature off. However, you can select the “Web” tab (sometimes hidden under “More” in the options) if you only want to see results from people who are, hopefully, a little more aware that fish aren’t big suckers. The feature is currently rolling out in the US, with plans to expand to the rest of the world in the coming months.
