Nvidia chief executive Jensen Huang has long claimed that the world’s data centres will need to be completely overhauled to handle the demands of generative AI. He argues that it will take $1tn over the next four to five years, essentially doubling the amount already sunk into digital infrastructure, to train and run the new AI models.
Nvidia itself has been the most obvious beneficiary of this. Its vertiginous stock market rise has turned it into the world’s most valuable company.
But a spate of earnings announcements and AI-related deals over the last two weeks has also brought encouraging evidence that the boom sparked by the launch of OpenAI’s generative AI chatbot ChatGPT is spreading. Of course, there is no way of telling whether the wave of spending will be sustainable or large enough to justify the huge run-up in tech stocks, but it has at least brought some comfort for the bulls.
Shares in chipmaker Broadcom, for instance, have jumped more than 20 per cent since it reported its latest AI-induced bump to its sales. Another rise like that and it would join the rarefied group of tech companies valued at more than $1tn, over six times what it was worth half a decade ago.
Much of the lift has come from the demand for AI accelerators, the chips Broadcom custom-designs for customers like Google to speed up their AI calculations. But its surging growth also points to the more important role that high-speed networking has come to play in data centres.
The massive amount of information needed to train and run AI models has required much faster connections between individual processors, as well as between the different machines running inside data centres. Broadcom chief executive Hock Tan estimated that networking will account for 40 per cent of his company’s AI chip sales by the end of this year.
Meanwhile, shares in software maker Oracle, a company that was late to the cloud, jumped 17 per cent after news of a deal to train OpenAI’s large language models on its cloud infrastructure. The deal involves bringing the Azure cloud service of OpenAI’s partner, Microsoft, to a giant new Oracle data centre — a relationship between two of the tech industry’s oldest enemies that would once have been unthinkable.
Elsewhere, even Hewlett Packard Enterprise, which had looked like missing out on the booming demand for AI servers that has lifted rivals Supermicro and Dell, finally caught a break on Wall Street. Its shares rose 24 per cent after earnings as investors reassessed its position in the AI boom.
As news like this has fuelled hopes that the AI boom is spreading to more suppliers, several things are becoming clear. One is that the impact looks widespread, spanning many different parts of the “tech stack” — the hierarchy of components from chips to software that are required to run today’s complex IT systems.
Nvidia is still positioned to be by far the biggest winner. Most of its sales derive not from individual chips but from entire servers, often networked together in complete racks. Squeezing out the best performance comes from tweaking every element of these systems to work together, using Nvidia’s own proprietary technologies in areas like networking.
Nvidia’s biggest customers are desperate to reduce their dependence on the company and are pushing for new standards in everything from networking to AI software that would allow more competitors to emerge. But those initiatives will take time.
The biggest tech companies are also extending their direct involvement in more parts of the infrastructure required by AI. A key part of Apple’s AI announcement last week, for instance, was news that it is designing its own servers, reportedly based on in-house chip designs. Apple has already taken control of most of the key components in its handsets: a similar move is likely in the data centre as the demands of AI force it to bring more of the processing of its customers’ data back to its own facilities.
One result of moves like this is that the business models of suppliers have had to adapt, leading companies like Broadcom to play supporting roles as customers take more control. The so-called “hyperscalers” — the largest cloud companies — are also coming to account for a larger share of overall demand, leading to dependence on a narrower base of big customers. That will increase suppliers’ vulnerability in any downturn. But for now, Wall Street is fixated on just how many tech boats will be lifted by the rising tide of generative AI.