The head of Jon Peddie Research, a major graphics market analysis firm that’s been around for nearly 40 years, suggests that Intel could ax its Accelerated Computing Systems and Graphics Group (AXG). The division has been losing money for years and has failed to provide a competitive product for all of the market segments it serves. Forget the best graphics cards; Intel just needs to ship fully functional GPUs.
$3.5 billion lost
Jon Peddie estimates that Intel has invested around $3.5 billion in the development of its discrete GPUs – investments that have yet to pay off. In fact, Intel’s AXG has officially lost $2.1 billion since its formal inception in the first quarter of 2021. Given the track record of Pat Gelsinger, Intel’s chief executive, who has axed six businesses since the start of 2021, JPR suggests that AXG could be next.
“Gelsinger isn’t afraid to make tough decisions and kill favorite projects if they don’t produce — even projects he may personally like,” Peddie wrote in a blog post. “[…] It was rumored that the party was over and that AXG would be the next group to be dropped. This rumor was denied by Koduri.”
When Intel unveiled plans to develop discrete graphics solutions in 2017, it announced that its GPUs would address compute, graphics, media, imaging, and artificial intelligence capabilities for client and data center applications. As a bonus, the Core and Visual Computing Group was supposed to cater to the emerging edge computing market.
Five years into its discrete GPU journey, the company has released two low-end standalone GPUs aimed at low-cost PCs and some data center applications; launched its low-power graphics architecture for integrated GPUs; provided an API that can be used to program CPUs, GPUs, FPGAs, and other compute units; canceled its Xe-HP architecture for data center GPUs; postponed (multiple times) shipments of its Ponte Vecchio compute GPU for AI and HPC applications (most recently in part due to the late arrival of the Intel 4 node); and delayed the launch of its Xe-HPG ACM-G11 gaming GPU by about a year.
Considering the delayed market arrival of Intel’s Arc Alchemist 500- and 700-series GPUs, and the fact that they will have to compete with next-generation Radeon RX 7000 and GeForce RTX 40-series products from AMD and Nvidia, it is highly likely that they will struggle. This will obviously add to Intel’s losses.
To chop or not to chop
Intel’s AXG track record so far amounts to $3.5 billion spent with no tangible success, says Jon Peddie. For Intel, discrete GPUs are a completely new market that requires heavy investment, so the losses are not surprising. Meanwhile, Intel’s own Habana Gaudi2 deep learning processor has some rather tangible performance advantages over Nvidia’s A100 in AI workloads – a market also targeted by Intel’s Ponte Vecchio. This success could tip the balance toward the removal of AXG.
“It’s a 50-50 guess if Intel is going to slow things down and get out,” Peddie said. “If they don’t, the company faces years of losses as it tries to navigate a hostile and unforgiving market.”
Strategic importance of GPUs
While it may make sense for Intel to offload AXG and cancel discrete GPU development to reduce losses, it should be noted that Intel is pursuing several strategically important directions with the AXG division in general and discrete GPU development in particular. These directions include the following:
- AI/DL/ML applications
- HPC applications
- Competitive GPU architecture and IP to address client discrete and integrated GPUs as well as custom solutions offered by IFS
- Data center GPU for video rendering and encoding
- Edge computing applications with discrete or integrated GPUs
- Hybrid processing units for AI/ML and HPC applications
Discrete GPU development per se has only been a loss for Intel so far (we wonder how much money the Xe-LP iGPU architecture has made Intel after two years on the market), but it should be noted that without a competitive GPU architecture that can serve everything from a low-end laptop to a supercomputer, Intel won’t be able to seize many new growth opportunities.
Habana Gaudi2 appears to be a competitive DL solution, but it cannot be used for general compute-intensive applications. Additionally, without further evolution of Intel’s Xe-HPC data center GPU architecture, the company will not be able to build hybrid processing units for AI/ML and HPC applications (e.g., Falcon Shores). Without these XPUs, Intel’s plan to reach ZettaFLOPS-class performance by 2027 looks increasingly unrealistic.
Although Intel’s discrete GPU efforts have fallen short of expectations, Intel needs an explicitly parallel computing architecture for many upcoming applications. GPUs have proven to be the best architecture for highly parallel workloads, whether they require low computational precision like AI/DL/ML applications or full FP64 precision like supercomputing applications.
If Intel ends the development of standalone GPUs, it will have to completely rethink its roadmap, both in terms of products and architectures. For example, it will have to find a competitive GPU architecture supplier for its client processors, because a small internal iGPU development team will hardly be able to deliver integrated graphics competitive with those offered by AMD and Apple in their client systems-on-chips (SoCs).
Intel’s discrete GPU effort may have already cost the company around $3.5 billion, has so far failed to pay off, and will likely generate further losses. Killing the AXG division therefore looks like an increasingly attractive management move. However, GPUs and derivative hybrid architectures are strategically important for many of the markets Intel serves and for the applications it will need to serve in the years to come, so offloading AXG seems counterproductive. A lot probably also depends on Intel’s graphics driver issues – and the drivers are no quick fix.
What will Pat Gelsinger do? Maybe we’ll find out sooner rather than later. “Maybe the clouds will lift by the end of this term,” Jon Peddie muses.