Intel Launches 4th Gen Xeon Scalable "Sapphire Rapids", Xeon CPU Max Series

Written by Michael Larabel in Processors on 10 January 2023 at 01:00 PM EST.

Intel has announced the 4th Gen Xeon Scalable "Sapphire Rapids" CPUs today along with the Xeon CPU Max Series (Sapphire Rapids HBM) and Data Center GPU Max Series (Ponte Vecchio). Here is an overview of today's announcements prior to getting to some initial Sapphire Rapids Linux benchmarks on Phoronix.

With 4th Gen Xeon Scalable processors, Intel is promoting a 53% general purpose compute improvement over prior Xeon Scalable "Ice Lake" processors, up to 10x better AI inference performance, up to 3x higher performance for data analytics, and up to 2x higher data compression performance for networking and storage. Many of these gains with Sapphire Rapids come from the new onboard accelerators found with these processors. Meanwhile on the Xeon CPU Max Series side -- the Sapphire Rapids SKUs with HBM2e memory -- there is up to a 3.7x improvement for memory-bound workloads.

Sapphire Rapids introduces the Intel Data Streaming Accelerator (DSA) that I have been covering from the Linux angle since 2019. The Data Streaming Accelerator can offload data movement tasks between the CPU, memory, and caches as well as attached memory/storage/network devices.
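
For those wanting to verify DSA is exposed on a given Linux installation, below is a minimal C sketch that simply lists what the kernel's idxd driver reports under /sys/bus/dsa/devices. This assumes the idxd driver is loaded and that the dsaN / wqN.M naming holds for your kernel version; actual offload work goes through the work queues rather than this enumeration.

/* Minimal sketch: enumerate DSA devices exposed by the Linux idxd driver.
 * Assumes idxd is loaded and populates /sys/bus/dsa/devices; device and
 * work-queue naming may differ across kernel versions. */
#include <dirent.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *path = "/sys/bus/dsa/devices";
    DIR *dir = opendir(path);
    if (!dir) {
        perror("opendir");   /* idxd not loaded or no DSA hardware */
        return 1;
    }

    struct dirent *entry;
    while ((entry = readdir(dir)) != NULL) {
        /* DSA engines show up as dsaN, their work queues as wqN.M */
        if (strncmp(entry->d_name, "dsa", 3) == 0 ||
            strncmp(entry->d_name, "wq", 2) == 0)
            printf("%s/%s\n", path, entry->d_name);
    }
    closedir(dir);
    return 0;
}

On a Sapphire Rapids system with idxd loaded this should print the DSA devices and their work queues, while on hardware without DSA the opendir() call will simply fail.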

Sapphire Rapids also has the In-Memory Analytics Accelerator (IAA) for helping accelerate database and analytics workloads while also providing increased power efficiency. The new processors also have QuickAssist Technology (QAT), which has been available in select past chipsets, for helping with encryption/decryption/compression tasks.
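
To make the workload concrete, here is a minimal C sketch of the kind of deflate-style compression task QAT (and the compression side of IAA) is intended to offload, using plain zlib as the software path. Routing this through the accelerators would instead go via accelerator-aware libraries and drivers, whose APIs are beyond the scope of this overview.

/* Minimal sketch of the software compression path that QAT/IAA aim to
 * offload. Plain zlib is used here as the CPU baseline; accelerator
 * offload would go through accelerator-aware libraries instead. */
#include <stdio.h>
#include <zlib.h>

int main(void)
{
    const Bytef input[] = "Sapphire Rapids accelerator demo data, "
                          "Sapphire Rapids accelerator demo data.";
    uLong in_len = sizeof(input);

    uLongf out_len = compressBound(in_len);  /* worst-case output size */
    Bytef out[256];

    if (compress2(out, &out_len, input, in_len, Z_BEST_SPEED) != Z_OK) {
        fprintf(stderr, "compress2 failed\n");
        return 1;
    }
    printf("compressed %lu bytes down to %lu bytes\n",
           (unsigned long)in_len, (unsigned long)out_len);
    return 0;
}

Compile with -lz; the point is just to show the baseline CPU work that these accelerators aim to take off the cores.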

Very exciting with Sapphire Rapids is the introduction of Advanced Matrix Extensions (AMX) for helping AI workloads, in particular deep learning training and inference. Intel AMX will be the focus of some dedicated follow-up benchmark articles on Phoronix in the weeks ahead.
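
As a quick illustration of what AMX support looks like from the Linux application side, below is a minimal C sketch that checks the CPUID AMX feature bits and then requests permission from the kernel to use the large tile-data state via arch_prctl(ARCH_REQ_XCOMP_PERM), as documented for dynamically enabled XSAVE features. The constants reflect my reading of the public documentation, so treat them as assumptions to double-check against your kernel headers.

/* Minimal sketch: detect AMX via CPUID and request tile-data permission
 * from the Linux kernel (arch_prctl ARCH_REQ_XCOMP_PERM). Constant values
 * follow the public kernel documentation; verify against your headers. */
#include <cpuid.h>
#include <stdio.h>
#include <sys/syscall.h>
#include <unistd.h>

#define ARCH_REQ_XCOMP_PERM 0x1023  /* from the kernel's asm/prctl.h */
#define XFEATURE_XTILEDATA  18      /* AMX tile data xstate component */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* CPUID leaf 7, subleaf 0: EDX bit 22 = AMX-BF16, bit 24 = AMX-TILE,
     * bit 25 = AMX-INT8. */
    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx) ||
        !(edx & (1u << 24))) {
        puts("AMX tiles not reported by CPUID");
        return 1;
    }

    /* Ask the kernel for permission to use the large AMX tile state. */
    if (syscall(SYS_arch_prctl, ARCH_REQ_XCOMP_PERM, XFEATURE_XTILEDATA)) {
        perror("arch_prctl(ARCH_REQ_XCOMP_PERM)");
        return 1;
    }
    puts("AMX available and tile data permission granted");
    return 0;
}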

Intel has also improved its AVX-512 implementation with Sapphire Rapids, which should come with less of a negative impact from its use compared to prior generations. Again, this is something that will be covered in more detail in follow-up Phoronix benchmark articles.
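
Given those past frequency concerns, a common software pattern is deciding at runtime whether to take an AVX-512 code path at all. Below is a minimal C sketch of that dispatch using GCC/Clang's __builtin_cpu_supports(); the two kernel functions are hypothetical placeholders rather than anything from Intel.

/* Minimal sketch of runtime dispatch between an AVX-512 code path and a
 * portable fallback, using GCC/Clang's __builtin_cpu_supports(). The two
 * kernel functions are hypothetical placeholders for real implementations. */
#include <stdio.h>

static void sum_avx512(const float *data, int n)
{
    (void)data; (void)n;
    puts("using AVX-512 path");   /* real AVX-512 kernel would go here */
}

static void sum_scalar(const float *data, int n)
{
    (void)data; (void)n;
    puts("using scalar fallback path");
}

int main(void)
{
    float data[1024] = {0};

    if (__builtin_cpu_supports("avx512f"))
        sum_avx512(data, 1024);
    else
        sum_scalar(data, 1024);
    return 0;
}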

Sapphire Rapids CPUs do have Trust Domain Extensions (TDX) support, a feature I've covered from the Linux support side for a while. Interestingly though, with Sapphire Rapids this TDX support is only being made available to select cloud providers for helping improve VM security/confidentiality. It looks like it will be with next-generation Xeon Scalable CPUs that Intel makes TDX more broadly available.

But before getting too far into the Sapphire Rapids features, here is what most of you are likely most interested in: the SKU table.

The flagship Intel 4th Gen Xeon Scalable processor is the Xeon Platinum 8490H that is 60 cores / 120 threads with a 1.9GHz base clock, 2.9GHz all-core turbo clock, and 3.5GHz maximum turbo clock. The Xeon Platinum 8490H has a 350 Watt TDP, 112.5MB cache, and carries a recommended customer price of $17,000 USD.

The Xeon Platinum 8490H will be competing against AMD's EPYC 9654 processor, which as a reminder is 96 cores / 192 threads, 2.4GHz base clock, all-core boost up to 3.55GHz, and a maximum boost clock of 3.7GHz. The EPYC 9654 has a 360 Watt default TDP. When it comes to the core/thread count and rated clock speeds, the flagship AMD EPYC 9654 processor easily comes out ahead of the Xeon Platinum 8490H... AMD EPYC Genoa also supports 12-channel DDR5-4800 compared to 8-channel DDR5-4800 with Sapphire Rapids. But where Intel aims to compete with Xeon Scalable "Sapphire Rapids" is on workloads that can exploit its various onboard accelerators. Intel is heavily focusing on the accelerator story for Sapphire Rapids.

Generationally, the flagship Xeon Platinum 8490H is a big boost over the prior-generation Ice Lake flagship, the Xeon Platinum 8380. Going from the 8380 to the 8490H is a jump from 40 cores / 80 threads to 60 cores / 120 threads, from 60MB to 112.5MB of cache, 8-channel DDR4-3200 is replaced by 8-channel DDR5-4800, and the maximum turbo frequency went up from 3.4GHz to 3.5GHz. Plus Sapphire Rapids has the new accelerators, AMX, and other goodies as noted. But going from the Xeon Platinum 8380 to the 8490H also means a lower base frequency of 1.9GHz compared to 2.3GHz with Ice Lake, and the TDP is up from 270 Watts to 350 Watts, but at least there are 50% more cores along with the other additions. Or as another comparison point there is the existing Xeon Platinum 8380H at 28 cores / 56 threads with a 2.9GHz base frequency, 4.3GHz maximum turbo frequency, 38.5MB cache, and 250 Watt TDP.

On the topic of accelerators, looking at the SKU table you will also note the "Default" mention for the number of DSA / QAT / DLB / IAA devices... That default mention and the "Intel On Demand Capable" columns ultimately go together. As previously disclosed, Intel On Demand (formerly known as Software Defined Silicon) will tie into the accelerator availability / number of accelerator devices on select SKUs. In the case of the Xeon Platinum 8490H / 8480H models, all of the devices are enabled, but as you work your way down the stack fewer accelerator devices are available until ultimately there are no accelerators at all, unless using Intel On Demand to purchase an upgrade -- beyond the listed recommended customer pricing.

It will be interesting to see how Intel's On Demand model plays out, especially with these accelerators being all the more important for being able to compete effectively with the likes of AMD EPYC Genoa and Ampere Altra. Intel says it pursued On Demand due to customers wanting to shift some expenses from capex to opex and to allow customers the ability to upgrade over time.

On the SKU table is also the Xeon CPU Max Series, with the flagship model there being the Xeon CPU Max 9480 with 56 cores and on-package HBM2e memory. Unfortunately I don't have my hands on any Xeon CPU Max Series processors at the moment for what would be very interesting benchmarking with that onboard HBM2e memory.

Anyways, that's the quick overview of the rather complex SKU table representing current Xeon Scalable Sapphire Rapids processors.

The above graphic from Intel further breaks down some of the differences between the Sapphire Rapids Xeon Platinum, Xeon Gold, and Xeon Silver series.
