
Samsung has unveiled a new DRAM module based on the CXL (Compute Express Link) interface, touting it as an industry first. The CXL-based DDR5 memory module comes in the EDSFF form factor and is aimed at server systems, where it can scale bandwidth and memory capacity by a significant margin.

The new module can scale memory capacity to the terabyte level, mitigate latency caused by memory caching, and enable server systems to easily handle machine learning, AI, and other demanding workloads.

One of a kind

The CXL interface enables higher speeds and lower latency in communication between the host processor and other devices (memory buffers, accelerators, and smart I/O devices), while also expanding bandwidth and memory capacity.

The interface itself is the product of the CXL Consortium, which was formed in 2019. Its mission is to address the growing memory capacity and bandwidth needs of systems with high processor counts.

These systems are used to process mountains of data for intensive applications like artificial intelligence and machine learning. It would appear the consortium's work has paid off.

Roll-out coming soon

The members of the consortium include companies like Samsung, Intel, and Google, plus other chip and server companies from around the world. Samsung says that besides the CXL hardware, the memory also comes with controller and software technologies (error management, interface conversion, and memory mapping).

With those capabilities, host processors (CPUs and GPUs) can recognize the CXL-based memory and use it as main memory.
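As an illustration (not something described in the article), CXL memory expanders are typically exposed to Linux software as an additional, CPU-less NUMA node, so applications can place data on them with ordinary memory APIs. The minimal sketch below uses libnuma and assumes node 1 is the CXL-backed node; the node number and buffer size are placeholders.

    /* Illustrative sketch only: allocate a buffer from an assumed
     * CXL-backed NUMA node (node 1) using libnuma.
     * Build with: gcc cxl_alloc.c -lnuma
     */
    #include <numa.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "NUMA support not available\n");
            return 1;
        }

        size_t size = 1UL << 30;   /* 1 GiB buffer (placeholder size) */
        int cxl_node = 1;          /* assumed CXL memory node */

        /* Request memory bound to the assumed CXL-backed node. */
        void *buf = numa_alloc_onnode(size, cxl_node);
        if (buf == NULL) {
            fprintf(stderr, "allocation on node %d failed\n", cxl_node);
            return 1;
        }

        memset(buf, 0, size);      /* touch the pages so they are faulted in */
        printf("1 GiB allocated on NUMA node %d\n", cxl_node);

        numa_free(buf, size);
        return 0;
    }

In this model, no special driver calls are needed in application code; the CXL-attached capacity simply shows up as another memory node the operating system can allocate from.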

The module has been validated for use on Intel’s new server platforms. Samsung is in talks with cloud and data centre providers to roll it out on a much larger scale.