ABI Research reports that cloud service providers are set to own 18% of the total cloud AI chipset market by 2024

ABI Research estimated on Wednesday that cloud service providers commanded a 3.3 percent share of total cloud AI chip shipments in the first half of 2019. These players will increasingly rely on their own in-house AI chips and will ship a total of 300,000 cloud AI chips in 2024, representing 18 percent of global cloud AI chipset shipments that year.

During the last two years, several cloud service providers, including Alibaba, Amazon, Facebook, Google, Huawei, and Tencent, have been busy designing their own in-house chipsets for handling Artificial Intelligence (AI) workloads in their data centers. 

The growing demand for intelligent services across many enterprise verticals is pushing cloud service providers to rapidly upgrade their data centers with AI capabilities, which has already created enormous demand for cloud AI chipsets in recent years.

In recent years, many companies have emerged offering interesting takes on how to address the challenge of AI workloads in the cloud. On one hand, startups like Cerebras Systems, Graphcore, Habana Labs, and Wave Computing have announced new chipsets that offer higher performance or better computational flow than conventional chipsets.

On the other hand, captive vendors have started to build their own AI chips to power their data centers. Examples of these vendors include Amazon, Google, Huawei, Baidu, and potentially Alibaba.

Overall, the market for cloud AI chipsets was estimated at US$3.5 billion in 2018 and is expected to grow to $19.1 billion in 2024. Right now, most of the market share is captured by non-captive vendors. As cloud service providers take on the majority of AI workloads, we believe their market share will grow from 2.3 percent in 2018 to 9.4 percent in 2024.

For companies to be successful in this sector, a chipset must not only be highly scalable and flexible and strike the right balance between performance and power budget, but also feature strong ecosystem support and a comprehensive software stack.

ABI Research expects revenues from these chipset shipments to increase significantly over the next five years, from $4.2 billion in 2019 to $10 billion in 2024. Established chipset suppliers such as NVIDIA, Intel, and, to a certain extent, Xilinx will continue to dominate the market landscape, thanks to the robust developer ecosystems they have created around their AI chipsets.

However, these players will face increasingly intense competition from many new entrants and challengers, particularly their own clients: webscale companies such as Google, Alibaba, Amazon, and Huawei.

These findings are from ABI Research’s Cloud AI Chipsets: Market Landscape and Vendor Positioning application analysis report. This report is part of the company’s AI and Machine Learning research service, which includes research, data, and analyst insights. Based on extensive primary interviews, Application Analysis reports present in-depth analysis on key market trends and factors for a specific technology.

This trend of developing in-house AI chipsets, initiated by Google in 2017, has led many other webscale companies to follow Google's lead. Baidu followed with its own AI chipset, Kunlun, in 2018, and later that year Amazon introduced its Inferentia chip to support Amazon Web Services (AWS).

AWS has strong influence in the AI industry due to the success of SageMaker, its machine learning development platform. Huawei is another captive company that has moved toward using its in-house chips for its cloud services in an attempt to reduce its reliance on Western chipset suppliers. The company launched the Ascend 310 and 910 in 2018 and has since expanded its product lineup into a series of cloud AI hardware, including an AI accelerator card and an AI system. Recently, Huawei launched Atlas 900, an AI training cluster that competes directly with NVIDIA's DGX and features over 1,000 Ascend 910 chipsets.

“The approach by webscale companies to develop in-house AI chips allows for better hardware-software integration and resources tailored to handle specific AI networks, which serves as a key differentiating point not only at the chipset level but also at the cloud AI service level,” said Lian Jye Su, Principal Analyst at ABI Research. “The success of these highly optimized processing units provides strong validation for the emergence of other cloud AI Application-Specific Integrated Circuits (ASICs) startups, such as Cerebras Systems, Graphcore, and Habana Labs.”

“This further expands the footprint of cloud AI service providers, as they are also competing with Intel and NVIDIA for the mindshare of developers. By offering end-to-end AI hardware solutions, Google, Amazon, and Huawei can ensure that their users will enjoy the ease of development and deployment while creating an active and vibrant developer community around their chipset solutions and ultimately generating a large user base for their cloud AI services,” concluded Su.
