AMD EPYC Enters Azure Cloud Through Microsoft Project Olympus

AMD and its customer Microsoft made a big EPYC cloud datacenter announcement this week. Microsoft’s new Azure L-Series EPYC storage instance should deliver a lot more performance and storage per dollar to Azure customers, at lower cost per instance to Microsoft. This is due to a combination of AMD EPYC performance and cost structure plus Microsoft’s Project Olympus system architecture and specifications.

AMD also foreshadowed more announcements in the pipeline before the end of the year. Microsoft and Baidu stood onstage with AMD at the EPYC launch; perhaps that is a hint.

Azure’s new Ls storage instance preview

This week, Microsoft Azure announced an advanced preview of its new EPYC-based L-Series storage instances. ‘L’ stands for Linux Virtual Machines, and these instances are CPU-only virtual machines with no offload acceleration (no GPU, FPGA, etc.). Microsoft appended an ‘s’ to the instance names to differentiate them from the existing Intel Xeon L-Series instances. I’ll call the EPYC instances ‘Ls’.

Microsoft’s new AMD EPYC Ls instances have the same number of hardware threads as its Intel Xeon-based ‘L’ instances. Each Intel core uses Hyper-Threading to execute two simultaneous threads, and Microsoft counts each EPYC hardware thread, also two per core, as a virtual CPU (vCPU). However, Microsoft doubled (2x) the amount of memory in each Ls instance over the existing L instance and almost tripled (3x) the amount of storage per instance. (I translated the L instances’ core counts into vCPUs in the following table.)

[Table: Azure L (Xeon) vs. Ls (EPYC) instance comparison (vCPUs, memory, storage). Source: TIRIAS Research]

Microsoft has not disclosed pricing for the new Ls instances, nor timing for general availability. My uninformed but educated guess is that Ls pricing will match L pricing at a per-vCPU level. That should deliver a lot more performance and storage per dollar to Azure customers, at a lower cost per instance to Microsoft. Storage capacity keeps getting less expensive over time, but Intel processor average selling prices (ASPs) have been increasing. AMD EPYC changes that dynamic completely.
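
To make that per-dollar arithmetic concrete, here is a minimal Python sketch. The two-threads-per-core vCPU counting and the 2x memory / roughly 3x storage multipliers come from the paragraphs above; the baseline L-instance figures, the instance size and the per-vCPU price are hypothetical placeholders, not disclosed Azure numbers.

```python
# Illustrative sketch only. The baseline figures and price below are
# hypothetical placeholders; the 2x memory, ~3x storage and price-parity
# assumptions come from the analysis above, not from Microsoft.

CORES = 8                    # hypothetical instance size
THREADS_PER_CORE = 2         # Xeon Hyper-Threading and EPYC SMT both counted as vCPUs
VCPUS = CORES * THREADS_PER_CORE

l_mem_gib = 64.0             # placeholder memory for the Xeon-based L instance
l_storage_gib = 1000.0       # placeholder local storage for the L instance
price_per_vcpu_hr = 0.05     # placeholder price, USD per vCPU-hour

# Apply the multipliers described above to get the EPYC-based Ls instance.
ls_mem_gib = 2.0 * l_mem_gib           # memory doubled (2x)
ls_storage_gib = 3.0 * l_storage_gib   # storage almost tripled (~3x)

# Assume Ls pricing matches L pricing per vCPU, so both instances cost the same.
instance_price_hr = VCPUS * price_per_vcpu_hr

print(f"{VCPUS} vCPUs at ${instance_price_hr:.2f}/hr (assumed for both L and Ls)")
print(f"L : {l_mem_gib / instance_price_hr:6.0f} GiB RAM per $/hr, "
      f"{l_storage_gib / instance_price_hr:6.0f} GiB storage per $/hr")
print(f"Ls: {ls_mem_gib / instance_price_hr:6.0f} GiB RAM per $/hr, "
      f"{ls_storage_gib / instance_price_hr:6.0f} GiB storage per $/hr")
```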

The role of Project Olympus

A big part of lowering the cost of AMD’s hardware platform is Microsoft’s Project Olympus donation to the Open Compute Project (OCP). Lowering hardware costs does not rest completely on the EPYC vs. Xeon price differential. Microsoft’s brilliant move is simply to donate modular “plug and play” standards for equipment it actually plans to buy and deploy at scale in its Azure datacenters–something Facebook never really managed to do with OCP.

[Image: Microsoft’s four Azure processor suppliers. Source: TIRIAS Research]

Microsoft selected four processor vendors as Azure suppliers earlier this year, pictured above. The net effect on the supply chain for AMD, Cavium and Qualcomm has been stunning. Each of them created OCP Project Olympus-compatible reference designs, and all the design variants I am seeing in the market are Olympus-compatible derivatives of those reference designs. Project Olympus designs fit into a wide variety of competing rack form factors. Microsoft can buy and deploy any of them. This gives Microsoft a huge supply chain advantage over cloud giants who specify unique designs. And it gives AMD and the other two processor vendors a couple of high-volume motherboard designs (full width and half width) that can be leveraged among many customers, lowering the cost to deliver each of their unique value propositions.

Intel’s dominant market position has resulted in extreme platform proliferation. Intel uses customization as a lever to give cloud giants exactly the motherboards they want–but at a price. Microsoft’s new Azure purchasing model turns that customization into a disadvantage. I would be surprised if Intel converted a sizable part of its cloud supply chain to Project Olympus-compatible designs. And that means that not only are Intel’s processors more expensive than its new competitors’, but so are its motherboards.

The Project Olympus-compatible full-width, dual-socket Ls motherboard sports two high-end EPYC 7551 32-core processors, with a core frequency of 2.2 GHz and a maximum single-core turbo frequency of 3.0 GHz.

[Image: Project Olympus full-width, dual-socket AMD EPYC Ls motherboard. Source: TIRIAS Research]
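
As a quick sanity check on what one of these boards exposes to the hypervisor, the sketch below multiplies out the socket, core and thread counts cited above; it assumes the same two-threads-per-core vCPU counting described earlier.

```python
# Back-of-the-envelope vCPU capacity of the dual-socket Olympus Ls board.
SOCKETS = 2
CORES_PER_SOCKET = 32    # EPYC 7551
THREADS_PER_CORE = 2     # SMT threads, counted as vCPUs per the article

total_cores = SOCKETS * CORES_PER_SOCKET
total_threads = total_cores * THREADS_PER_CORE
print(f"{total_cores} cores, {total_threads} hardware threads (vCPUs) per board")
```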

Project Olympus motherboards have PCIe risers and are very extensible in terms of adding storage, networking and compute offload capabilities, even in shallow 1U chassis form factors.

Take-Aways

Microsoft says its L-Series instances–both L (Xeon) and Ls (EPYC)–are optimized for high disk throughput and I/O, and “ideal” for Big Data analytics, SQL and NoSQL databases (such as Cassandra, MongoDB, Cloudera and Redis). AMD notes that Apache Spark will also run well in the larger Ls memory footprints.

The key to this Azure advanced preview of an AMD EPYC-based instance is that it starts Microsoft’s purchasing engine for a very flexible motherboard. AMD EPYC will have value in other workloads, and now it is much easier for Microsoft to create new EPYC-based instances, because it is only a matter of buying more equipment. Deploying EPYC for other Azure workloads does not carry the additional time and expense of qualifying other designs.

TIRIAS Research doesn’t see any challenges to Microsoft broadening its relationship with AMD. Given that AMD is the only x86 instruction set alternative to Intel, we believe it is likely that Microsoft will eventually deploy EPYC to run cloud instances of Windows Server applications.

Microsoft’s support for EPYC in Azure is noteworthy all by itself, and so is HPE’s support for EPYC in the ProLiant DL385 server. Adding another CSP (Baidu?) to AMD’s customer list by the end of this business year (now only two weeks away) would be truly impressive.

This Microsoft announcement and HPE’s support for EPYC in its ProLiant DL385 and in a Cloudline model show that AMD is getting vendor traction with EPYC. Market traction will become evident in 2018. We’ll see how these design wins affect AMD’s financials in the next six to 12 months.

-- The author and members of the TIRIAS Research staff do not hold equity positions in any of the companies mentioned. TIRIAS Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud.
