Google is searching for an answer to ChatGPT, Bloomberg reports

For more than two decades, Google Search has ruled the web

Desk Report

Published at 5:46 PM, Tue Mar 25th, 2025

One day in 2021, Google’s web search team presented leadership with what was, at the time, a novel proposal: Rather than just have the search engine serve up its familiar list of links, have a chatbot greet visitors at the search results page and offer to answer questions directly. This wasn’t necessarily a shocking idea. Chief Executive Officer Sundar Pichai had been talking for years about redesigning Alphabet Inc., Google’s parent company, around artificial intelligence, and the organization ran DeepMind and Google Brain, two of the world’s most sophisticated AI labs, reports Bloomberg. 

Still, the team’s management bristled at the proposal, according to a former employee with direct knowledge of those conversations. Googlers rarely suggested tinkering with the search engine’s fundamental design. “It was self-regulation. People just weren’t daring to think the thoughts,” the former employee says. The division’s leadership worried that its latest AI, though promising, wasn’t accurate enough. And even if it worked perfectly, answering users’ questions with AI risked upending Google’s core business of mixing so-called organic links with a healthy dose of targeted ads. The idea died, at least for the time being.

For more than two decades, Google Search has ruled the web. It serves as the primary gateway to the internet for billions of people—it currently processes almost 200,000 queries each second, according to the digital marketing company Semrush Holdings Inc. Roughly two-thirds of all web traffic referrals come from the search engine. Search is also Google’s beating heart, generating more than $198 billion in revenue in 2024, almost 60% of Alphabet’s annual sales.

The machine is still humming away, but a chorus of discontent has been building among the web-going public in recent years. Users complain that Google results are increasingly larded with advertising and self-serving features. Its power over the web also means a substantial portion of the internet has been designed primarily not for human consumption but for Google’s own web scrapers. Junky sites with poorly researched listicles or aggregated product reviews seize prominent space in results, frustrating users and grabbing ad revenue from more useful websites less versed in search engine optimization. Tech critics (and lawyers representing the federal government in antitrust litigation) have been arguing that Google’s continued dominance in the face of such shortcomings is proof that the search market is no longer competitive.

Then in 2022, OpenAI introduced something new. ChatGPT bore a notable resemblance to the 2021 proposal Google had rejected. Like the original version of Google, OpenAI’s chatbot provided a simple field for entering text and not much else. The results it spit out didn’t display ads above the real answers or offer links to long-winded recipe sites where multiple autoplaying videos made it hard to concentrate on the steps to make a chickpea salad. And even though its answers weren’t always right, the sheer novelty meant users gave OpenAI a level of grace they might not have extended to the long-running leader of the search world.

One thing that really stung about ChatGPT’s rise was that it was built on Google’s own inventions. OpenAI’s chatbot uses an AI architecture that Google detailed in a now-legendary research paper published in 2017. The breakthrough, a system known as a transformer that helps AI models zero in on the most important pieces of information they’re analyzing, was free for all to use. That Google’s engineering team had woven the technology into search only in the safest of ways showed how much the company struggled to translate its AI breakthroughs into substantial consumer products.

Around the time ChatGPT arrived, pushing through this inertia became the job of Elizabeth Reid. A veteran Googler, Reid joined the search team in 2021 and took over the unit in March 2024. Since then she’s ushered in some of the biggest changes to Google’s core product in years—most notably AI Overviews, which cedes the most prominent space on the search results page to AI-generated responses. In March the company said it would begin experimenting with “AI Mode,” a dedicated tab on its homepage that offers a chat-based search experience similar to the one it had rejected four years earlier.

Reid refers to her approach as a “constant evolution” rather than a complete overhaul. Her team is still struggling to define the purpose of Google Search in this new era, according to interviews with 21 current and former search executives and employees, most of whom requested anonymity to avoid straining professional relationships, plus more than two dozen other people in the tech and media industries.

In the meantime, multiple independent web publishers say their traffic has been falling. They say AI Overviews poses a particular challenge because it presents information directly on Google’s own results pages that users previously would have gotten by clicking through to the websites where it originated. In February the online education company Chegg Inc. sued Alphabet, saying the search feature was cribbing Chegg’s own content, posing a dire warning to the company. Google’s conduct “threatens to leave the public with an increasingly unrecognizable internet experience, in which users never leave Google’s walled garden and receive only synthetic, error-ridden answers,” Chegg said in its suit. José Castañeda, a Google spokesperson, responded to the suit by saying the company would “defend against these meritless claims.”

AI’s impact on Google itself is just beginning to show. Its search engine is one of the most profitable technologies ever developed, and, more than two years after ChatGPT’s debut, there’s little evidence that this is changing, though some analysts anticipate slower search revenue growth in the coming years. The company made more than $200 billion in gross profit last year.

Still, Google is acting with urgency. Shortly after OpenAI released ChatGPT, Google reassigned more than 1,000 engineers, about 20% of the search engineering team, to generative AI efforts (albeit with only vague marching orders), according to a former Google employee. In an interview with Bloomberg Businessweek, Pichai, who’s said that AI is a bigger deal than fire or electricity, says the world is on the verge of a radical transformation in the way it interacts with information. “I think we are at 1% of what humanity’s information needs are today,” he says. “It’ll be obvious a decade or 20 years from now. And I think we are underestimating how early all of this is.” And so this is an existential moment for Google. It may also be an existential moment for the web itself.

Prior to ChatGPT, Google’s search team was deep into what Arvind Jain, who held the title of distinguished engineer at Google until 2014, called “maintenance mode.” Thousands of engineers tended to an internal code base to keep the profits rolling in. It was vast and full of relics from years past, including 100,000 lines of code to make a feature that allowed people to vote in the 13th season of American Idol. (The initial code base for Uber Technologies Inc. numbered about 10,000 lines.) Search engineers fought over the currency of the realm: latency, or how long a web page takes to load. New features risked increasing load times, so Google invented a system that one former manager likened to cap-and-trade schemes for carbon emissions. To release new projects, teams first had to show they’d reduced latency elsewhere, sending engineers on missions to make unrelated parts of Google Search slightly faster.

Other times they toiled on projects they knew were pointless. In 2020 a former Google manager was charged with helping a team of dozens of engineers who were working on a project to make web search infrastructure more efficient. About six months in, a vice president let slip that he had no intention of releasing the project. But he advised the manager to keep going so the engineers would have something to show their bosses at the end of the year. Not everyone at Google hated such assignments. One former employee says people sometimes found it relaxing to work on projects for which everyone knew the stakes were low. But the barriers to introducing products hurt morale among engineers and sparked tension between management and rank-and-filers. In 2021, Manu Cornet, a search engineer who was also Google’s resident cartoonist, captured the dynamic by sketching an image of a massive cargo ship with cannons, towers and cranes grafted on top of an aging hull patched with duct tape. Surveying the horizon from the deck, the captain remarks, “Poor execution speed. The rowers need a culture shift.”

Google did sometimes push the bounds of search, as it did in 2016 with Google Assistant, a feature to field simple voice commands, such as checking the weather or finding out who’d won the Warriors game. Many Googlers wanted to push this further, but momentum was tempered by uncertainty about how the product would work with its existing advertising-based business model, according to two people familiar with the matter, one of whom framed the debate as an early taste of the angst that generative AI would bring. “How the f--- do we put an ad on it?” one of the two, a former manager, says of voice search. “That’s when the real crisis started.”

Google’s role as a web indexer insulated it from the unreliability of the open internet: Because it was simply pointing to other sites, its users were less likely to blame it for things they found there. The company was periodically criticized for serving as a distribution system for scams and offensive content. But the reputational risks would clearly increase if Google began providing more information directly.

Google has been struggling with the implications of making that shift for more than a decade. In 2012 the company balked at releasing the Knowledge Graph, a collection of important facts about the world. The database, assembled by pulling information from sites Google scraped for search, could help the company respond to queries with direct answers and photographs. After months of work, it determined it had reached more than 95% accuracy, according to a former employee. But the product provided incorrect information in an internal presentation, and executives refused to give the green light. These hesitations would slow the company’s response to generative AI as well. “They had the burden of, Google speaks the truth,” says Jain, the former distinguished engineer.

Engineers eventually met Google’s bar, and the product went ahead that same year. Once the tool was public, the dilemma switched from its accuracy to its impact on the economics of the web. The sites that Google pulled the information from relied on its search engine to send them web traffic and, often, ad revenue. Later, in internal meetings, Google executives argued the data was fair game because Google credited sources such as Wikipedia in small print beneath its answers, according to one of the former employees.

Googlers were well aware of the tension between the preferences of users, who liked getting information quickly, and the needs of websites that produced that information. “I don’t know what the right answer is,” Cornet says. “I would say that, at least as an employee, I felt like the focus on the user was a good enough reason for me to think that Google wasn’t trying to do anything nefarious—even though it may put some companies out of business.”

On the flip side, Google’s business model also supported services that were distinctly not useful for users but were tuned to extract ad revenue, like scammy product affiliate sites and clickbaity news aggregators. Over the past several years, blog posts have appeared lamenting the declining quality of Google Search. Users started trying to avoid low-quality sites with tricks such as appending the term “Reddit” to their queries in the hope of locating threads where the information was coming from real people.

It’s hard to empirically measure the quality of something as vast and ever-changing as Google Search. But academics who’ve studied the subject say quality has notably declined. “I think there really is a feeling of decay that is just widely felt,” says Emma Lurie, a doctoral candidate at the University of California at Berkeley who’s been studying search engines since 2017.

Some Googlers bristled at these complaints. Representatives for Google point to independent assessments determining that its results are higher quality than other search engines. “People have high expectations for search and what it does for them,” says Pandu Nayak, Google’s chief search scientist. “And when it delivers on those high expectations, they don’t notice it because it’s just working the way it should.”

Even within the company, criticism was mounting that Google was being guided by the wrong incentives. There’s a natural tension between the search unit, which works to produce the most useful results for users’ queries, and the advertising division, which looks to maximize the revenue those queries produce. To keep the priorities of the advertising division from distorting organic search results, the two divisions had traditionally been separated. Some Googlers felt the firewall was weakening as growth leveled off, according to two former employees who’d worked in search. In early 2019, Google declared a “Code Yellow” because it might not meet its goals for search revenue for the quarter, according to documents unearthed in the US Department of Justice’s 2023 antitrust trial over Google’s search engine, which ultimately resulted in a federal judge’s ruling that Google maintained an illegal monopoly in search. (The company has said it will appeal.)

As part of the Code Yellow emergency, engineers from Google’s search and Chrome browser teams were reassigned to figure out why user queries had slowed. This trend spelled trouble for Google’s advertising business, because each query represents an opportunity to display a targeted ad. But the actions Google took to address this problem made then-search chief Ben Gomes uncomfortable. “I think it is good for us to aspire to query growth and to aspire to more users. But I think we are getting too involved with ads for the good of the product and company,” Gomes wrote in an email made public during the trial.

Google called off the Code Yellow seven weeks after it began, with Prabhakar Raghavan, then head of Google’s advertising division, praising “heroic” engineering for helping the company reach its revenue goals despite the slowdown in queries. Shortly thereafter, Gomes shifted into a new role in the educational division, and Raghavan became head of both search and ads—further eroding the divide in the eyes of some Googlers. When asked about such concerns, Pichai says that “commercial information is information, too,” and that advertising can be valuable as long as it’s clearly identified. “The true north is the users,” he says. “I think focusing on the users and focusing on quality ends up being the approach by which we will do it all.”

The 2019 concerns about query volume would seem quaint compared with the reaction to ChatGPT. Reid had joined the search team only 19 months before and was still learning how it differed from her previous posting at the company’s Maps division. “It’s like you’re the next-door neighbor who was always in the house, but not after bedtime,” Reid says. She pantomimes inspecting the depths of a closet. “You still have a lot to learn.”

Some people who worked at Google when ChatGPT arrived describe a panic sweeping through the company. But, recalling the moment as she guides a pair of Businessweek reporters around the Googleplex in Mountain View, California, while dressed in an outfit based on Google’s rainbow palette, Reid downplays the idea that the company was shaken by the news. There were plenty of people at Google old enough to remember when Microsoft Corp.’s 2009 release of the Bing search engine was seen as an existential threat. (It wasn’t.) Business as usual had generally worked, and some weren’t inclined to rock the boat. Reid, though, was ready to implement real changes. “She is very data-driven,” says Brian McClendon, who worked with her on Google Maps and is now a senior vice president at Niantic Inc. “She would not make a change based on hope, but if she believed she had the data, that this other way is better, she’d be a steamroller to get there.”

Reid has been rising quickly ever since joining Google in 2003, shortly after graduating with a degree in computer science from Dartmouth College, not far from her New Hampshire hometown. She worked in the New York office, where she gained a reputation for being both creative (she sewed her own wedding dress) and meticulous (she checked her team’s code to a level of detail unusual for most managers). In her new role, she’s one of the most important people at the company other than Pichai and enigmatic founders Larry Page and Sergey Brin.

Earlier in her Google career, Reid had worked on an early version of local search, a Maps feature that allowed people to limit their search to a geographical area. Brin, who oversaw the launch, pushed the team to release it even though they hadn’t yet completed building the ideal technical infrastructure. This sort of thing would never fly in the search division today. It was also the right call, according to Reid. “We learned what people really wanted two months faster,” she says. She describes it as a lesson in “how you experiment when you’re trying to reinvent what’s possible.”

Reid has been attempting to bring that flexibility to her new post. Her team rolled out Search Labs, where enthusiastic users could sign up to try unreleased features, giving the company a way to get user feedback on generative AI experiments before rolling them out to the general public. Reid predicts that the traditional Google search bar will become less prominent over time. Voice queries will continue to rise, she says, and Google is planning for expanded use of visual search, too. Rajan Patel, a vice president for search experience, demonstrated how parents can use Google’s visual search tools to help their kids with homework, or to surreptitiously take a photo of a stylish stranger’s sneakers in a coffee shop to buy the same pair (something Patel did recently). The search bar isn’t going away anytime soon, Reid says, but the company is moving toward a future in which Google is always hovering in the background. “The world will just expand,” she says. “It’s as if you can ask Google as easily as you could ask a friend, only the friend is all-knowing, right?”

Another step in that direction was Google’s recent announcement of AI Mode, which allows users to explore topics in a conversational fashion and ask follow-up questions. Robby Stein, a vice president of product for Google Search, framed the feature as a way for users to explore complex questions that are a poor fit for traditional keyword search; the company’s internal testing showed queries doubling in length. It’s also a chance for Google to flirt with a new business model. AI Mode will roll out first to users who pay for a subscription to Google’s premium AI features, a subtle but significant shift for a search engine that has always been free.

As AI became more prominent, Google began to lose employees to OpenAI, Anthropic and other startups that were moving faster and building more novel products. But, according to several people familiar with the company, morale at the search division has improved as it has begun to feel more dynamic under Reid’s leadership.

On earnings calls, Pichai has touted the company’s progress in bringing down the costs of delivering AI answers. Brin has again become a regular presence in Mountain View, and he personally recruited his longtime colleague Noam Shazeer, one of Google’s most storied engineers, to rejoin the company, Jain says. “What I have seen is renewed energy,” he says. “Those early engineers are getting together, and now they have a target. There’s somebody to catch up with.”

Reid argues that Google is now on track. “Things start slowly and then quickly. Suddenly the combination of the tech and the product and the use and the understanding and the polish and everything comes together, and then everyone needs it,” she says on her walk around the Googleplex. “It’s really exciting to work on search at a time when you think the tech can genuinely change what people can search for.”

As she exits the Maps campus in the fading afternoon light, Reid walks a winding path through a grove of redwoods, gingerly stepping aside to accommodate a Googler on a company-issued bike. Caterers in neatly pressed uniforms set the stage for an alfresco happy hour. It almost feels like the glory days of the mid-2010s, when Google was beloved by internet users and invincible in the market.

Google’s executives weren’t entirely wrong several years ago when they saw risk in building products that might give bad information in Google’s own voice. AI Overviews, for instance, has parroted the false claim that Barack Obama is a Muslim, given outdated information about federal student loan policies and suggested that people eat one rock per day for health reasons. Skepticism about the reliability of AI-powered chatbots was already widespread, and Google’s willingness to accept such “hallucinations” could be read as an ominous sign of how much it had lowered its standards to keep pace with other AI companies.

To Google, the answers were seen as embarrassing but not an indication of some fatal weakness in AI. Company representatives also said the examples ricocheting through social media were primarily the result of people bombarding the product with odd queries. Before the advent of generative AI, “people did not go to Google Search and say, ‘How many rocks should I eat per day?’ ” Reid says. “They just didn’t.” She says she doesn’t think Google could have done any better with the release.

Google’s generative AI products still carry disclaimers that the technology is experimental. Testing tools in public helps them get better, Reid says. She’s convinced that, as with other changes to search, AI will get people to use Google even more than they did before.

Some of the strongest reactions are coming not from users but from that other core Google constituency: independent website publishers. Since it became big enough to matter, Google has been in a delicate dance with the people making the sites, videos and articles that its products help surface. The company has consistently—and accurately—described its search as a vital way for websites to attract new users. But it’s also steadily been providing more information on the search page itself, obviating the need for users to click through.

Supporting publishers has always been incidental to Google’s larger aims, according to one former senior executive. “Giving traffic to publisher sites is kind of a necessary evil. The main thing they’re trying to do is get people to consume Google services,” the former executive says. “So there’s a natural tendency to want to have people stay on Google pages, but it does diminish the sort of deal between the publishers and Google itself.” Google contests this characterization, but AI Overviews may take it to its logical conclusion. Because Google’s AI is trained on the web, its answers are a kind of remixed version of the web itself, while also distorting the economy that gave rise to the modern internet by potentially cutting the sites out of the equation entirely.

This is a real problem for publishers such as Emily Henderson, an interior designer who in the early 2010s built a booming online business by blogging about how to choose just the right-size rug for a living room, or how far to place the coffee table from the couch. Her posts often landed among the top results on searches about home decoration. This drove traffic to her site, which in turn made her subsequent posts more likely to show up high in Google searches.

In 2024, after the introduction of AI Overviews, Henderson was dismayed to find Google’s AI sharing what appeared to be her decor guidelines as its own. Her traffic has dipped since May 2024, when AI Overviews was released. Although she’s still working to understand what this means for her business, she’s concluded she’ll have to find another way to attract readers. “I loved the internet how it was, and I’m just a little fearful that we’re going to lose people,” she says. “It’s just going to continue to get less human and more generic.”

Website operators are also encountering new competition from AI-generated sites designed to game Google’s search algorithms. As of March 2025 the news-rating group NewsGuard Technologies Inc. identified 1,254 news sites with content written largely or entirely by generative AI, up from 49 sites in May 2023. Dozens of AI content farms also appear on Google News, where they compete with traditional news outlets for clicks, according to NewsGuard. And Google’s advertising business places ads on hundreds of AI-generated sites, sharing in the income they generate. In some cases, major brands’ ads have appeared on spammy AI sites, as when marketing messages from the American Red Cross, Charles Schwab and Google itself showed up on news content farms with names such as What’s New 2 Day and Time News.

A Google spokesperson says that AI-generated content isn’t necessarily against the rules but that it takes action against low-quality sites and those that do violate its policies. “What generative AI did was give people a free and really easy-to-use tool to automate and scale spam techniques in a really big way,” says Lily Ray, senior vice president for search engine optimization strategy at the marketing agency Amsive LLC. At the same time, she says, the standard practices that websites engaged in to make their sites legible to Google and therefore likely to become more visible in search are becoming less effective because of constant changes to Search’s ranking algorithms. She says the relationship between Google and publishers is as tense as it’s been at any point in the past two decades.

Reid says that Google cares deeply about publishers and that AI Overviews is a jumping-off point for users to conduct further research on the open web. Pichai, for his part, stresses the need to send “high-quality” traffic to websites, instead of making users click around on sites that may not be relevant to them. “We are in the phase of making sure through this moment that we are improving the product, but in a way that prioritizes sending traffic to the ecosystem,” he says, adding, “That’s been the most important goal.”

In October, Google convened a group of about 20 website creators at its Mountain View headquarters, in what it called the “Web Creator Conversation Event.” The company invited product reviewers, travel bloggers, and entertainment and lifestyle writers, and set up a series of meetings with its search engineers. “They told us that our content and our websites are exactly the kind of thing that Google wants to reward and that they need to do better,” says Gisele Navarro, the managing editor for HouseFresh, a website that specializes in in-depth reviews of air purifiers.

The engineers asked the creators for tips on how to recognize when content was made by humans rather than generative AI, according to three people who were at the meeting. But Nayak, Google’s chief search scientist, also advised publishers to take advantage of AI tools to help them write content, they said.

At one point an attendee pleaded with Google to roll back some of the updates it had made to search, arguing that it must want to remedy the harm it was doing to its partners. But Navarro remembers a Google engineer saying the company couldn’t make piecemeal adjustments to core changes to the algorithm. Since then some attendees have folded, and others are struggling to adjust. Google framed it as a broad conversation about the changing nature of search, but the message in the mind of at least some creators was starker, according to Navarro: “Something they told us in that meeting was to never expect to go back to our old levels of traffic,” she says, “because search has changed.”


This report is taken from Bloomberg.
