Nvidia CEO Jensen Huang says his AI chips are improving faster than Moore's Law
Nvidia CEO Jensen Huang believes the performance of his company’s AI chips is advancing at a rate that surpasses the historical pace defined by Moore’s Law, the principle that guided decades of computing progress.
“Our systems are progressing way faster than Moore’s Law,” Huang said in an interview with TechCrunch on Tuesday, following his keynote speech to an audience of 10,000 at CES in Las Vegas.
First articulated by Intel co-founder Gordon Moore in 1965, and revised by Moore a decade later, Moore's Law predicted that the number of transistors on a computer chip would double roughly every two years, effectively doubling the chip's performance on the same cadence. For decades, this principle spurred rapid advancements in computing capabilities while driving down costs.
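To make that cadence concrete, here is a back-of-the-envelope sketch (an illustration for this piece, not a figure from Huang's remarks) of how a two-year doubling compounds:

```python
# Compounding implied by a two-year doubling cadence (idealized).
# Illustrative only; real transistor counts never tracked this curve exactly.
for years in (2, 10, 20):
    factor = 2 ** (years / 2)  # one doubling per two years
    print(f"After {years} years: ~{factor:.0f}x the transistors")
# After 2 years: ~2x; after 10 years: ~32x; after 20 years: ~1024x
```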
While Moore’s Law has slowed in recent years, Huang asserts that Nvidia’s AI chips are advancing at a far quicker pace. The company claims its latest data center superchip is more than 30 times faster at running AI inference workloads than its predecessor.
“We can innovate across the entire stack—the architecture, the chip, the system, the libraries, and the algorithms—all at the same time,” Huang explained. “When you do that, you can move faster than Moore’s Law.”
Huang’s assertion comes amid a growing debate over whether AI progress is slowing. Companies like Google, OpenAI, and Anthropic rely on Nvidia’s AI chips to train and run their most advanced AI models, so improvements to Nvidia’s hardware can directly accelerate advancements in AI capabilities.
This isn’t the first time Huang has suggested that Nvidia is outpacing Moore’s Law. In a November podcast, he described the AI industry as being on a trajectory he called “hyper Moore’s Law.”
Huang dismisses concerns that AI progress is slowing down, pointing to what he describes as three distinct AI scaling laws: pre-training, in which AI models learn patterns from vast datasets; post-training, in which models are fine-tuned using methods like human feedback; and test-time compute, in which a model spends more compute working through its answer during the inference phase.
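Of the three, test-time compute is the easiest to sketch in code. The toy example below is a hedged illustration of one common flavor of it, best-of-n sampling; the generator and scorer are hypothetical stand-ins, not Nvidia's or any lab's actual stack. The idea: spend more inference compute by sampling more candidate answers, then keep the best-scoring one.

```python
import random

def generate_candidate(prompt: str, rng: random.Random) -> str:
    """Stand-in for one sampled model response (hypothetical)."""
    return f"{prompt} -> draft #{rng.randint(0, 9999)}"

def score(candidate: str) -> float:
    """Stand-in for a reward model / verifier score (hypothetical)."""
    return random.Random(candidate).random()

def best_of_n(prompt: str, n: int, seed: int = 0) -> str:
    """Spend more inference compute (larger n) to pick a better answer."""
    rng = random.Random(seed)
    candidates = [generate_candidate(prompt, rng) for _ in range(n)]
    return max(candidates, key=score)

if __name__ == "__main__":
    # Doubling n doubles the inference compute spent per query; the best
    # candidate's score can only improve with n -- the essence of test-time scaling.
    for n in (1, 4, 16):
        print(n, best_of_n("Explain Moore's Law", n))
```

Unlike pre-training and post-training, which spend compute before a model ships, this kind of search spends it on every query, which is why Huang ties the trend directly to inference costs.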
“Moore’s Law was pivotal in computing history because it dramatically reduced costs,” Huang said. “The same trend is now happening with inference—where we’re significantly improving performance, and as a result, the cost of inference will continue to decline.”
Huang’s confidence in Nvidia’s progress aligns with the company’s meteoric rise in value during the AI boom, a run-up that also gives him a strong incentive to emphasize how quickly AI is advancing.