
AMD reveals next-generation AI chips with OpenAI CEO Sam Altman


Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart Building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday revealed new details about its next-generation AI chips, the Instinct MI400 series, which will ship next year.

The MI400 chips will be able to be assembled into a full server rack called Helios, AMD said, which will allow thousands of the chips to be tied together so they can be used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared onstage with Su and said his company would use the AMD chips.

"When you first started telling me about the specs, I thought, there's no way, that sounds completely crazy," Altman said. "It's going to be an amazing thing."

AMD's rack-scale setup will make the chips appear to a user as a single system, which matters for most artificial intelligence customers, such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and use massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," said Su, comparing it to Nvidia's Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses at the Artificial Intelligence (AI) Action Summit at the Grand Palais in Paris on Feb. 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also enables its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for developing and deploying AI applications.

OpenAI, a notable Nvidia customer, gave AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips, AMD plans to compete against rival Nvidia on price, with a company official telling reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the market for data center GPUs, partly because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia using its proprietary CUDA software.

"It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress," Su said.

AMD shares are roughly flat so far in 2025, signaling that Wall Street does not yet see the company as a major threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less both to operate and to acquire.

"Across the board, there is a meaningful cost-of-acquisition delta that we then layer on top of our competitive performance advantage, in addition to significant double-digit percentage savings," Dieckmann said.

In the coming years, big companies and cloud providers are poised to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the total market for AI chips to exceed $500 billion by 2028, although it has not said how much of that market it can claim. Nvidia holds more than 90% of the market today, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, rather than a biannual one, underscoring how fierce competition has become and how important cutting-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase earlier this year of ZT Systems, a server maker that developed the technology AMD needed to build its rack-sized systems.

"These AI systems are getting super complicated, and complete solutions are really critical," Su said.

What AMD now sells

Currently, the most advanced AMD AI chip being installed by cloud providers is its Instinct MI355X, which the company said started shipping in production volumes last month. AMD said the chips would be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for "inference," the computing power required to actually deploy a chatbot or generative AI application, which can use much more processing than traditional server applications.

"What has really changed is that the demand for inference has grown significantly," Su said.

AMD officials said Thursday that they believe their new chips are better at inference than Nvidia's. That is because AMD's chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the computing power of its predecessor, AMD said. Those chips will be able to compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

Oracle plans to offer clusters with more than 131,000 MI355X chips to its customers, AMD said.

Meta officials said Thursday that they use AMD CPU and GPU clusters to run inference for the company's Llama model, and that Meta plans to buy AMD's next-generation chips.

A Microsoft representative said the company uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost. It does not sell the chips on their own, and end users generally buy them through a hardware company such as Super Micro Computer. But the company plans for the MI400 chips to compete on price.

The Santa Clara company pairs its GPUs with its CPUs and the networking chips from its 2022 Pensando acquisition to build its Helios racks. That means greater adoption of its AI chips should benefit the rest of AMD's business as well. It also uses an open-source networking technology called UALink to closely integrate its rack systems, in contrast to Nvidia's proprietary NVLink.

AMD claims that its MI355X can deliver 40% more tokens (a measure of AI output) per dollar than Nvidia's chips, because its chips use less power than its rival's.
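Tokens per dollar folds throughput and energy cost into a single metric. A minimal sketch of the arithmetic behind that kind of claim, using entirely made-up throughput, power-draw, and electricity-price figures (none of these numbers come from AMD or Nvidia):

```python
def tokens_per_dollar(tokens_per_second: float,
                      power_watts: float,
                      dollars_per_kwh: float) -> float:
    """Tokens generated per dollar of electricity consumed."""
    kwh_per_second = power_watts / 1000 / 3600      # watts -> kWh consumed each second
    cost_per_second = kwh_per_second * dollars_per_kwh
    return tokens_per_second / cost_per_second

# Two hypothetical GPUs with equal throughput but different power draw:
chip_a = tokens_per_dollar(tokens_per_second=10_000, power_watts=1_000,
                           dollars_per_kwh=0.10)
chip_b = tokens_per_dollar(tokens_per_second=10_000, power_watts=1_400,
                           dollars_per_kwh=0.10)

# At equal throughput, the advantage reduces to the ratio of power draws (1400/1000).
print(f"chip_a delivers {chip_a / chip_b:.0%} of chip_b's tokens per dollar")
```

This only captures electricity cost; a full cost-per-token comparison would also fold in chip price, cooling, and real measured throughput on a given model.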

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies generally buy them in large quantities.

AMD's AI chip business is still much smaller than Nvidia's. The company said it booked $5 billion in AI sales in its fiscal 2024, but JPMorgan analysts expect 60% growth in the category this year.

WATCH: AMD CEO Lisa Su

