
Nvidia Blackwell GPU production ramps up amid revenue surge

Nvidia's Blackwell GPU production is 'full steam' ahead as revenue surges with AI demand.

Nvidia CEO Jensen Huang on Wednesday addressed investor concerns about the Blackwell GPU overheating, stating that the data center AI accelerator is in full production and sales will ramp up this quarter.

Huang took questions from Wall Street analysts after the company reported fiscal third-quarter revenue that surged 94% from the previous year to $35.1 billion, higher than analysts expected. Data center revenue reached a company record of $30.8 billion, a 112% increase from last year.

The company said it expects revenue to rise to $37.5 billion, plus or minus 2%, in the current quarter, which ends Jan. 31.

A media report that Nvidia had asked suppliers to redesign server racks because of an overheating problem with Blackwell worried investors, who feared the report would affect future revenue. Huang didn't address the question directly when asked about the story, responding that Blackwell's growing demand would exceed supply for several quarters.

Jensen Huang, CEO, Nvidia

"Production is in full steam," Huang said. "Blackwell is in great shape, and as we mentioned earlier, the supply and what we're planning to ship this quarter is greater than our previous estimates."

Nvidia is the dominant supplier of GPUs that power the massive generative AI models used by the largest cloud providers, AWS, Google and Microsoft. It is also a major supplier to Meta, which owns Facebook.

Demand for the company's chips has driven its valuation to more than $3 trillion and fueled nine consecutive quarters of revenue that beat analysts' expectations.

"The first few quarters that I covered Nvidia, I could write many different things, but now it seems like it's almost monotonous in terms of performance," said Alvin Nguyen, an analyst at Forrester Research.

That doesn't mean Blackwell hasn't faced speed bumps. Nvidia unveiled the product in March and promised to ship it in the second quarter; however, a design flaw delayed production by the company's manufacturing partner, Taiwan Semiconductor Manufacturing Co.

With Blackwell on track, Nvidia's next challenge could be navigating a trade war between the U.S. and China, a market that contributed $5.4 billion in revenue last quarter. President-elect Donald Trump has threatened a 60% tariff on goods from China, and economists say China could retaliate with tariffs or restrictions on U.S. companies doing business there.

"We will, of course, support the administration. That's our highest mandate." Huang told analysts. "We will comply with any regulation that comes along fully and support our customers to the best of our abilities."

In the longer term, Nvidia will have to expand the market reach of its AI accelerators beyond cloud providers and companies with hyperscale data centers, Nguyen said. Surveys show that over the next couple of years, more traditional enterprises will deploy AI applications on specialized models that run on chips far less powerful than Blackwell or Nvidia's previous-generation GPU built on the Hopper microarchitecture.

The expected transition would open the market to competitors, including AMD and Intel.

"If small models that fit in a rack or on four or five servers are good enough, that's a game changer in terms of what you buy and the quantities," Nguyen said.

The shift to running generative AI applications in enterprise data centers will start next year as organizations move from proof-of-concept projects to production, according to Gartner. The analyst firm predicts that generative AI will help triple enterprise server sales globally to $332 billion by 2028.

Antone Gonsalves is an editor at large for TechTarget Editorial, reporting on industry trends critical to enterprise tech buyers. He has worked in tech journalism for 25 years and is based in San Francisco. Have a news tip? Please drop him an email.
