Global technology giants accelerate their AI chip push
2024-04-17
The competition for AI chips is growing increasingly fierce, and a "computing power war" is under way. Global technology giants are racing to develop their own AI supercomputing chips, both to seize the opportunities of AI and to reduce their dependence on external chip makers such as Nvidia and cut chip procurement costs.

Artificial intelligence (AI) technology and applications are advancing rapidly, and more and more enterprises are stepping up research and development to secure a place in the AI era. Compared with the application layer, investment in computing infrastructure is even more pressing.

On April 10 local time, Meta announced the latest version of its in-house chip, MTIA, a custom chip series designed by Meta specifically for AI training and inference. Compared with Meta's first-generation AI inference accelerator released in May 2023, the new chip offers significantly improved performance and is built for the ranking and recommendation systems in Meta's social apps.

On April 9, Google announced that it is producing Axion, an ARM-based chip for data processing and computing in data centers. Google said on its official website that Axion delivers industry-leading performance and energy efficiency in scenarios such as information retrieval, global video distribution, and generative AI. Microsoft and Amazon had already begun developing custom chips capable of handling AI tasks.

Technology giants are joining the race to build their own AI supercomputing chips not only to seize the opportunities of AI but also out of practical considerations. The model training and inference workloads of the new generation of AI have driven a sharp increase in demand for high-capacity, ultra-high-speed chips. Prices for Nvidia's AI chips, which hold more than 70% of the market, keep climbing, in some cases multiplying several times over, yet the chips remain in short supply. With AI chips increasingly scarce, developing their own silicon allows technology giants to reduce their dependence on external suppliers such as Nvidia and save on procurement costs. In addition, unlike general-purpose hardware, technology companies can tailor custom hardware to their own AI models, stripping out unneeded functions to cut costs and improve efficiency.

According to the latest forecast from market research firm Gartner, the AI chip market will grow 25.6% year over year in 2024, reaching $67.1 billion. By 2027, the market is expected to more than double from its 2023 size, reaching $119.4 billion.

Intel, AMD, and other chipmakers are also speeding up the launch of higher-performing AI chips to compete with Nvidia for market share. On the evening of April 9 Beijing time, Intel unveiled its new cloud AI chip, Gaudi 3, and its sixth-generation Xeon Scalable processor in the United States, further expanding its AI product roadmap.

With the competition for AI chips intensifying and the "computing power war" under way, it will be worth watching what innovative applications emerge on top of this massive computing power. (Lai Xin She)