In early February, first Google, then Microsoft, announced major changes to their search engines. Both tech giants have spent heavily on building or buying generative AI tools that use large language models to understand and answer complex questions. Now they’re trying to integrate them into search, hoping they’ll give users a richer, more accurate experience. Chinese search company Baidu has announced that it will follow suit.

But the fascination with these new tools may be hiding a dirty secret. The race to build high-performance AI-powered search engines is likely to demand a dramatic increase in computing power, and with it a significant rise in the energy technology companies consume and the carbon they emit.

“There are already huge resources involved in indexing and searching online content, but implementing artificial intelligence requires a different kind of firepower,” says Alan Woodward, a professor of cyber security at the University of Surrey in the UK. “It requires computing power as well as storage and efficient retrieval. Whenever we see a step change in online processing, we see a significant increase in the power and cooling resources required for large processing centers. I think this could be such a step.”

Training large language models (LLMs), such as those underlying OpenAI’s ChatGPT, which will power Microsoft’s augmented search engine Bing and Google’s equivalent, Bard, means parsing and inferring relationships within vast amounts of data, which is why such models have tended to be developed by companies with significant resources.

“Training these models requires enormous computing power,” says Carlos Gómez-Rodríguez, a computer scientist at the University of Coruña in Spain. “At the moment, only large technology companies can train them.”

Although neither OpenAI nor Google has disclosed the computing cost of its products, third-party analysis by researchers estimates that training GPT-3, which ChatGPT is partly based on, consumed 1,287 MWh and led to emissions of more than 550 tons of carbon dioxide equivalent – as much as a single person making 550 round trips between New York and San Francisco.
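The quoted figures can be sanity-checked with simple arithmetic. A minimal sketch, using only the numbers reported above (the implied grid carbon intensity and per-flight figure are derived quantities, not claims from the article):

```python
# Back-of-envelope consistency check of the reported GPT-3 figures.
TRAINING_MWH = 1287          # reported training energy, MWh
EMISSIONS_TONNES = 550       # reported emissions, tonnes CO2 equivalent
ROUND_TRIPS = 550            # reported NY-SF round-trip equivalent

# Implied carbon intensity of the electricity used during training.
intensity_kg_per_mwh = EMISSIONS_TONNES * 1000 / TRAINING_MWH  # ~427 kg CO2e/MWh

# Implied emissions per round trip: roughly one tonne CO2e each.
tonnes_per_trip = EMISSIONS_TONNES / ROUND_TRIPS
```

An implied intensity of roughly 427 kg CO2e per MWh sits in the range of a fossil-heavy grid mix, so the two reported numbers are at least internally plausible.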

“It’s not that bad, but then you have to take into account [the fact that] not only do you have to train it, you have to run it and serve millions of users,” says Gómez-Rodríguez.

There’s also a big difference between using ChatGPT, which investment bank UBS estimates has 13 million daily users, as a standalone product, and integrating it into Bing, which handles half a billion searches every day.

Martin Bouchard, co-founder of Canadian data center company QScale, estimates that, based on his reading of Microsoft’s and Google’s search plans, adding generative AI to the process will require at least “four to five times more computation per search”. He notes that ChatGPT’s knowledge of the world currently cuts off at the end of 2021, partly in an effort to keep computing requirements down.
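Bouchard’s multiplier can be combined with Bing’s reported volume to sketch the scale involved. A hedged, illustrative calculation only: the 0.3 Wh baseline per conventional search is an assumption (an often-cited figure Google published in 2009, not a number from this article), and it assumes compute scales directly into energy:

```python
# Illustrative estimate of the extra daily energy from AI-powered search.
# Assumption (not from the article): ~0.3 Wh per conventional search,
# and "four to five times more computation" means 4-5x the energy.

SEARCHES_PER_DAY = 500_000_000    # "half a billion searches every day"
BASELINE_WH_PER_SEARCH = 0.3      # assumed baseline energy per search, Wh

def daily_energy_mwh(multiplier: float) -> float:
    """Estimated daily search energy in MWh for a given compute multiplier."""
    wh = SEARCHES_PER_DAY * BASELINE_WH_PER_SEARCH * multiplier
    return wh / 1_000_000  # Wh -> MWh

baseline = daily_energy_mwh(1.0)                   # ~150 MWh/day today
low, high = daily_energy_mwh(4.0), daily_energy_mwh(5.0)  # ~600-750 MWh/day
```

Under these assumptions, the added load of 450–600 MWh per day would rival GPT-3’s entire reported training cost (1,287 MWh) every two to three days – which is why the per-search multiplier, not the one-off training run, dominates the picture.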

This will need to change to meet the demands of search engine users. “If they’re going to retrain the model frequently and add more parameters and stuff, that’s a whole different scale of things,” he says.
