'Cost Of Compute Far Beyond Employees' — Nvidia Insider Blows Lid Off AI's Insane Hidden Price Tag
Nvidia's Bryan Catanzaro highlights the soaring costs of AI compute, sparking industry-wide debate

A senior Nvidia executive has blown the lid off one of artificial intelligence's most closely guarded realities: the cost of compute now far exceeds the salaries paid to his own team.
Bryan Catanzaro, vice president of applied deep learning at Nvidia, told Axios that 'for my team, the cost of compute is far beyond the costs of the employees'. The comments, reported on 26 April 2026, have ignited debate about AI's insane hidden price tag as companies continue to invest heavily in the technology.
The admission from within the firm supplying much of the world's AI hardware has put a spotlight on the enormous operational expenses that often go unmentioned amid the hype surrounding artificial intelligence.
Nvidia's Own Reckoning with AI Economics
The statement is particularly noteworthy coming from Nvidia itself. Mr Catanzaro's team relies on the company's powerful graphics processing units to develop and deploy advanced deep learning models. Yet even here, the expense of compute – the electricity, hardware and data centre resources required – has surpassed payroll costs.
This internal dynamic reflects the sheer scale of modern AI workloads, which demand resources far beyond what was anticipated just a few years ago. Similar patterns are emerging elsewhere in the industry.
Technology firms experimenting with commercial AI tools have found inference costs eating into budgets. Uber's chief technology officer blew through his full 2026 AI budget due to token costs after intensive use of systems such as Anthropic's Claude.
Soaring Investment Meets Productivity Doubts
Big Tech firms have announced $740 billion (£547.9 billion) in capital expenditures for 2026, a 69% increase from 2025, according to Morgan Stanley analysis. Much of that money is flowing into AI servers, specialised chips and expanded data centres.
Worldwide IT spending will rise to $6.31 trillion (£4.69 trillion) in 2026, with AI infrastructure a major driver, Gartner forecasts. Yet the economic case for widespread automation remains shaky.
A 2024 Massachusetts Institute of Technology study, as reported by Yahoo Finance, determined that AI automation would be economically viable in only 23% of roles where vision is a primary part of the work. In the remaining 77%, it was cheaper for humans to continue doing the work. Meanwhile, the technology sector has shed more than 92,000 jobs so far in 2026.
The Limits of Aggressive AI Adoption
Some leaders have embraced what insiders describe as token-maxxing – pushing AI usage to the maximum even when it inflates expenses. One chief executive spoke of building an autonomous business by scaling intelligence rather than headcount. An Instagram post by a verified technology influencer captured the growing sentiment, noting that AI is getting expensive as some companies now spend more on AI compute than employee salaries.
Nevertheless, reliability concerns persist. AI systems can hallucinate or produce unreliable outputs, requiring human supervision that adds to the overall price. The MIT findings and ongoing layoffs suggest that the rush to replace staff may be premature in many areas. As fresh data on AI infrastructure spending emerges, Mr Catanzaro's remarks from Nvidia serve as a sober reminder of the technology's true costs.
With global AI budgets climbing, the industry is grappling with whether these investments will deliver returns before expenses spiral further. The comments highlight how compute costs that far exceed employee salaries are reshaping priorities across the sector.
© Copyright IBTimes 2025. All rights reserved.