Intensified competition in artificial intelligence chips
2023-12-13
There is fresh news from the artificial intelligence (AI) industry. On December 6th local time, AMD held its "Advancing AI" event and unveiled the highly anticipated MI300 series of chips. The new products can be used to train and run large language models, offering greater memory capacity and higher energy efficiency than the previous generation. AMD CEO Lisa Su described one of them, the Instinct MI300X accelerator, as "the world's highest-performing accelerator." The rapid development of AI applications is driving increasingly fierce competition in the chip industry.

The flagship Instinct MI300X platform combines eight MI300X GPUs. GPUs are advanced processors originally designed for graphics rendering; they were later found to excel at parallel computing and can accelerate a wide range of computationally intensive workloads. Compared with CPUs, FPGAs, and other chips, GPUs are currently the mainstream choice for AI training. Nvidia's "star product," the H100 GPU, sold hundreds of thousands of units in a year. Thanks to this, the company's results have repeatedly hit new highs, with revenue of $18.12 billion in the third quarter, up 206% year on year.

AI models place enormous demands on high-performance, high-compute chips, and the AI chip market is forecast to exceed $400 billion within the next four years. Amid this strong demand, technology giants such as AMD, Intel, and IBM are all developing AI chips, while companies such as Google, Microsoft, Alibaba, and Baidu are also pursuing in-house designs. The MI300X that AMD has now launched is highly anticipated: it packs more than 150 billion transistors and offers 2.4 times the memory of Nvidia's H100. It is comparable to the H100 in training large language models but performs better in inference.
AMD's new products have won over many major customers, with Microsoft, Meta, Oracle, and others announcing plans to adopt its chips. Even so, sales will be constrained by a number of factors, such as whether the pricing is competitive and whether the software ecosystem is compatible. Other chip giants are also accelerating development of new products. Nvidia has already announced the H200, its next-generation chip designed for training and deploying AI models, which is expected to ship in the second quarter of 2024, while Intel is focusing its research and development on increasing the HBM (High Bandwidth Memory) capacity of its AI chips. With the emergence of large models, the development goals of AI chips have clearly shifted toward high computing power, high flexibility, and low power consumption. Backed by stronger chips, more AI-related applications will emerge, bringing surprises to people's lives. (Lai Xin She)
Editor: Hu Sen Ming    Responsible editor: Li Xi
Source: XinhuaNet