The future of Google Search is AI. But not in the way you think. The company synonymous with web search isn’t all in on chatbots (even though it’s building one, called Bard), and it’s not redesigning its homepage to look more like a ChatGPT-style messaging system. Instead, Google is putting AI front and center in the most valuable real estate on the internet: its existing search results.
To demonstrate, Liz Reid, Google’s VP of Search, flips open her laptop and starts typing into the Google search box. “Why is sourdough bread still so popular?” she writes and hits enter. Google’s normal search results load almost immediately. Above them, a rectangular orange section pulses and glows and shows the phrase “Generative AI is experimental.” A few seconds later, the glowing is replaced by an AI-generated summary: a few paragraphs detailing how good sourdough tastes, the upsides of its prebiotic abilities, and more. To the right, there are three links to sites with information that Reid says “corroborates” what’s in the summary.
Google calls this the “AI snapshot.” All of it is generated by Google’s large language models, all of it sourced from the open web. Reid then mouses up to the top right of the box and clicks an icon Google’s designers call “the bear claw,” which looks like a hamburger menu with a vertical line to the left. The bear claw opens a new view: the AI snapshot is now split sentence by sentence, with links underneath to the sources of the information for that specific sentence. This, Reid points out again, is corroboration. And she says it’s key to the way Google’s AI implementation is different. “We want [the LLM], when it says something, to tell us as part of its goal: what are some sources to read more about that?”
A few seconds later, Reid clicks back and starts another search. This time, she searches for the best Bluetooth speakers for the beach. Again, standard search results appear almost immediately, and again, AI results are generated a few seconds later. This time, there’s a short summary at the top detailing what you should care about in such a speaker: battery life, water resistance, sound quality. Links to three buying guides sit off to the right, and below are shopping links for a half-dozen good options, each with an AI-generated summary next to it. I ask Reid to follow up with the phrase “under $100,” and she does so. The snapshot regenerates with new summaries and new picks.
This is the new look of Google’s search results page. It’s AI-first, it’s colorful, and it’s nothing like you’re used to. It’s powered by some of Google’s most advanced LLM work to date, including a new general-purpose model called PaLM 2 and the Multitask Unified Model (MUM) that Google uses to understand multiple types of media. In the demos I saw, it’s often extremely impressive. And it changes the way you’ll experience search, especially on mobile, where that AI snapshot often eats up the entire first page of your results.
There are some caveats: to get access to these AI snapshots, you’ll have to opt in to a new feature called Search Generative Experience (SGE for short), part of an also-new feature called Search Labs. Not all searches will spark an AI answer — the AI only appears when Google’s algorithms think it’s more useful than standard results, and sensitive subjects like health and finances are currently set to avoid AI interference altogether. But in my brief demos and testing, it showed up whether I searched for chocolate chip cookies, Adele, nearby coffee shops, or the best movies of 2022. AI may not be killing the 10 blue links, but it’s definitely pushing them down the page.
SGE, Google executives tell me over and over, is an experiment. But they’re also clear that they see it as a foundational long-term change to the way people search. AI adds another layer of input, helping you ask better and richer questions. And it adds another layer of output, designed to both answer your questions and guide you to new ones.
An opt-in box at the top of search results might sound like a small move from Google compared to Microsoft’s AI-first Bing redesign or the total newness of ChatGPT. But SGE amounts to the first step in a complete rethinking of how billions of people find information online — and how Google makes money. As pixels on the internet go, these are as consequential as it gets.
Google feels pretty good about the state of its search results. We’re long past the “10 blue links” era of 25 years ago, when you Googled by typing in a box and got links in return. Now, you can search by asking questions aloud or snapping a picture of the world, and you might get back everything from images and podcasts to TikToks.
Many searches are already well served by these results. If you’re going to Google and searching “Facebook” to land on facebook.com or you’re looking for the height of the Empire State Building, you’re already good to go.
But there’s a set of queries for which Google has never quite worked, which is where the company is hoping AI can come in. Queries like “Where should I go in Paris next week?” or “What’s the best restaurant in Tokyo?” These are hard questions to answer because they’re not actually one question. What’s your budget? What days are all the museums open in Paris? How long are you willing to wait? Do you have kids with you? On and on and on.
“The bottleneck turns out to be what I call ‘the orchestration of structure,’” says Prabhakar Raghavan, the SVP at Google who oversees Search. Much of that data exists somewhere on the internet or even within Google — museums post hours on Google Maps, people leave reviews about wait times at restaurants — but putting it all together into something like a coherent answer is really hard. “People want to say, ‘plan me a seven-day vacation,’” Raghavan says, “and they believe if the language model outputs, it should be right.”
One way to think about these is simply as questions with no right answer. A huge percentage of people who come to Google aren’t looking for a piece of information that exists somewhere. They’re looking for ideas, looking to explore. And since there’s also likely no page on the internet titled “Best vacation in Paris for a family with two kids, one of whom has peanut allergies and the other of whom loves soccer, and you definitely want to go to the Louvre on the quietest possible day of the week,” the links and podcasts and TikToks won’t be much help.
Because they’re trained on a huge corpus of data from all over the internet, large language models can help answer those questions by essentially running lots of disparate searches at once and then combining that information into a few sentences and a few links. “Lots of times you have to take a single question and break it into 15 questions” to get useful information from search, Reid says. “Can you just ask one? How do we change how the information is organized?”
That’s the idea, but Raghavan and Reid are both quick to point out that SGE still can’t do these completely creative acts very well. Right now, it’s going to be much more handy for synthesizing all the search data behind questions like “what speaker should I buy to take into the pool.” It’ll do well with “what were the best movies of 2022,” too, because it has some objective Rotten Tomatoes-style data to pull from along with the internet’s many rankings and blog posts on the subject. AI appears to make Google a better information-retrieval machine, even if it’s not quite ready to be your travel agent.
One thing that didn’t show up in most SGE demos? Ads. Google is still experimenting with how to put ads into the AI snapshots, though rest assured, they’re coming. Google’s going to need to monetize the heck out of AI for any of this to stick.
At one point in our demo, I asked Reid to search only the word “Adele.” The AI snapshot contained more or less what you’d expect — some information about her past, her accolades as a singer, a note about her recent weight loss — and then threw in that “her live performances are even better than her recorded albums.” Google’s AI has opinions! Reid quickly clicked the bear claw and sourced that sentence to a music blog but also acknowledged that this was something of a system failure.