Nvidia CEO Says More Advanced AI Models Will Keep Chip, Data Center Growth Going

AI bubble? What AI bubble? If you ask Nvidia CEO Jensen Huang, we’re in a “new industrial revolution.” 

Huang’s company, of course, makes the chips and computer hardware that serve as the “picks and shovels” of the AI gold rush, and it has become the world’s most valuable company by capitalizing on AI’s growth, bubble or not. Speaking Wednesday on an earnings call, as Nvidia reported revenue of $46.7 billion for the past quarter, Huang gave no indication that the generative artificial intelligence industry’s incredible growth will slow.

“I think the next several years, surely through the decade, we see really significant growth opportunities ahead,” Huang said.

Compare that with recent comments from OpenAI CEO Sam Altman, who said he believes investors right now are “overexcited about AI.” (Altman also acknowledged that he still believes AI is “the most important thing to happen in a very long time.”)

Huang said his company has “very, very significant forecasts” of demand for more of the chips and computers that run AI, indicating the rush for more data centers is not stopping soon. He speculated that AI infrastructure spending could hit $3 trillion to $4 trillion by the end of the decade. (The gross domestic product of the US is around $30 trillion.)

That means a lot of data centers, which take up a lot of land and consume a great deal of water and energy. These AI factories have grown bigger and bigger in recent years, with significant impacts on the communities around them and a growing strain on the US electric grid. And the spread of generative AI tools that require even more energy could push that demand higher still.

More powerful and demanding models

One prompt on a chatbot doesn’t always mean one prompt anymore. A major source of increased demand for computational power is that newer AI models that employ “reasoning” techniques use far more computing power to answer a single question. “It’s called long thinking, and the longer it thinks, oftentimes it produces better answers,” Huang said.

This technique allows an AI model to pull research from different websites, attempt a question multiple times to arrive at a better answer and assemble disparate information into a single response.

Some AI companies offer reasoning as a separate model or as an option labeled something like “deep thinking.” OpenAI built it right into its GPT-5 release, with a routing program that decides whether a prompt is handled by a lighter, straightforward model or a more intensive reasoning model.

But a reasoning model can require 100 times the computing power of a traditional large language model response, or more, Huang said. These models, along with agentic systems that can perform tasks and robotics models that can handle visual input and operate in the physical world, are keeping demand for chips, energy and data center land on the rise.

“With each generation, demand only grows,” Huang said.

Story by Jon Reed and the team at CNET.