What you’ll learn:
- What is CXL and why is it important?
- Why CXL 2.x memory support is the current product focus.
- How CXL addresses latency and other scaling issues.
Though the CXL 3.1 standard is available, it's the CXL 2.x products that are making a splash now. The standard enables large-scale memory pooling that works across products from different vendors.
I talked with Anil Godbole, Xeon CXL Strategy & Marketing Manager at Intel Corp. and Co-Chair of the Marketing Workgroup at the CXL Consortium, about CXL 2.x’s capabilities and the kinds of products that are currently on the market.
CXL is built on PCI Express (PCIe). PCIe 5.0 was the first version to support CXL, and PCIe 6.0 is now available. CXL is a cache-coherent, load-store, memory-style interface that differs from PCIe's peripheral-oriented protocol, which targets devices like Ethernet adapters.
CXL Dives into Memory Pooling
CXL provides a number of services, but the memory-pooling capability is one aspect that’s garnered lots of support from processor and memory vendors (see figure). A CXL memory controller can be the front end to a collection of memory devices like DDR4 or DDR5 DRAM. The controllers are connected to processors via the CXL/PCIe switch fabric. The same interface can be used for PCIe peripherals as well.
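To make the pooling model concrete, here's a minimal sketch of how CXL memory devices (memdevs) can be discovered on a Linux host. It assumes a kernel with CXL support that exposes devices under /sys/bus/cxl/devices; the memX/ram/size attribute follows the kernel's documented CXL sysfs ABI, though the exact layout can vary by kernel version.

```c
/* Minimal sketch: enumerate CXL memory devices (memdevs) via Linux sysfs.
 * Assumes a kernel with CXL support; the paths and attribute names follow
 * the kernel's CXL sysfs ABI and may differ across kernel versions. */
#include <dirent.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *base = "/sys/bus/cxl/devices";
    DIR *dir = opendir(base);
    if (!dir) {
        perror("opendir /sys/bus/cxl/devices");
        return 1;
    }

    struct dirent *de;
    while ((de = readdir(dir)) != NULL) {
        /* memdev entries are named mem0, mem1, ... */
        if (strncmp(de->d_name, "mem", 3) != 0)
            continue;

        char path[512];
        snprintf(path, sizeof(path), "%s/%s/ram/size", base, de->d_name);

        FILE *f = fopen(path, "r");
        if (!f)
            continue;

        char size[64] = "";
        if (fgets(size, sizeof(size), f))
            size[strcspn(size, "\n")] = '\0';
        fclose(f);

        /* ram/size reports the volatile (DRAM-backed) capacity of the device */
        printf("%s: volatile capacity = %s bytes\n", de->d_name, size);
    }

    closedir(dir);
    return 0;
}
```

The same information is available through the cxl-cli utility that ships with the ndctl project, which is the more common administrative path.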
Multicore processors already use a NUMA-style tiered-memory architecture. CXL simply moves this from proprietary interfaces to a standard one, giving multicore servers access to memory capacities ranging from terabytes to petabytes. Latency is on the order of 200 ns, depending on many factors: where the memory sits in the hierarchy, the delay through the memory controllers, and the memory devices themselves.
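Because CXL-attached memory typically appears to the operating system as a CPU-less, memory-only NUMA node, existing NUMA APIs apply directly. Below is a minimal sketch, assuming Linux with libnuma installed and a CXL memory node already enumerated; the node number used here is a placeholder, not something the article specifies.

```c
/* Minimal sketch: place an allocation on a CXL-backed, CPU-less NUMA node
 * using libnuma. The node number is a placeholder; check `numactl -H` or
 * /sys/devices/system/node to find the CXL memory node on a given system.
 * Build with: gcc cxl_alloc.c -lnuma */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    int cxl_node = 2;        /* placeholder: the CXL memory-only node */
    size_t len = 1UL << 30;  /* 1-GiB test buffer */

    /* numa_alloc_onnode() binds the pages of this allocation to the node */
    void *buf = numa_alloc_onnode(len, cxl_node);
    if (!buf) {
        fprintf(stderr, "allocation on node %d failed\n", cxl_node);
        return 1;
    }

    memset(buf, 0, len);     /* touch the pages so they're actually faulted in */
    printf("1 GiB allocated and touched on NUMA node %d\n", cxl_node);

    numa_free(buf, len);
    return 0;
}
```

Applications don't have to bind allocations explicitly; the kernel's memory-tiering support can also demote colder pages to the slower CXL node automatically.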
CXL memory solutions are showing up as PCIe cards and modules that typically incorporate standard DRAM. Storage isn't restricted to DRAM, but using non-volatile memory like flash often requires additional cache memory in front of it.
CXL memory pooling obviously targets massive hyperscaling and supercomputing systems. However, even high-end servers and workstations can benefit from scalable memory expansion. Applications employing artificial intelligence and machine learning can require large amounts of memory that would exceed the capabilities of conventional processor memory interfaces.
Check out more of our FMS 2024 coverage. Also, read more articles and view more videos in the TechXchange: CXL for Memory and More.