
Samsung Electronics To Expand AI-equipped Products With “Memory Super Gap” Strategy

The Samsung logo is displayed at the Samsung office in Seoul, South Korea. (Chung Sung-Jun/Getty Images)

SEOUL, South Korea – Multinational conglomerate Samsung Electronics will expand memory semiconductor product lines equipped with artificial intelligence (AI) engines by collaborating with global companies.

It plans to continue its “memory super gap” by expanding the memory semiconductor ecosystem through the convergence of memory and system semiconductors.


Samsung Electronics unveiled various product lines and application cases that applied PIM (Processing-in-Memory) technology to DRAM memory processes at the “Hot Chips” conference held online on Aug. 24.

“Samsung’s revelations include the first successful integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialized accelerator system, and broadened PIM applications to embrace DRAM modules and mobile memory, in accelerating the move toward the convergence of memory and logic,” Samsung said in a press release.

“The HBM-PIM has since been tested in the Xilinx Virtex UltraScale+ (Alveo) AI accelerator, where it delivered an almost 2.5X system performance gain as well as more than a 60 percent cut in energy consumption.”

Processing-in-Memory is a next-generation convergence technology that builds a system semiconductor processor function into memory, so that the computational operations needed for AI workloads can be carried out inside the memory itself.
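To make the idea concrete, the contrast between conventional processing and in-memory processing can be sketched in a few lines of Python. This is only a conceptual illustration, not Samsung's implementation; the function names and the toy "memory bank" model are invented for the example.

```python
import numpy as np

# Conventional model: data is copied from memory banks to the host processor,
# which then performs the computation.
def host_side_sum(memory_banks):
    gathered = np.concatenate(memory_banks)   # simulated data movement to the CPU
    return gathered.sum()                     # computation happens on the host

# PIM-style model: each memory bank reduces its own data locally,
# so only small partial results cross the memory bus.
def pim_style_sum(memory_banks):
    partials = [bank.sum() for bank in memory_banks]  # compute "in memory"
    return sum(partials)                              # host only combines partials

banks = [np.random.rand(1_000_000) for _ in range(8)]
assert np.isclose(host_side_sum(banks), pim_style_sum(banks))
```

Both paths produce the same answer; the difference is how much data has to move, which is the source of the performance and energy gains the company cites.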

Previously, Samsung Electronics developed the world’s first ‘HBM-PIM,’ which applied an AI engine to its HBM2 (High Bandwidth Memory) Aquabolt used in HPC (high-performance computing) and high-speed data analysis.

“HBM-PIM is the industry’s first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential,” said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics. 

“Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers.”

At the conference, Samsung Electronics introduced ‘AXDIMM (Acceleration DIMM),’ which applies an AI engine to the DRAM module, and ‘LPDDR5-PIM’ technology, which combines PIM with mobile DRAM, along with system application cases.

AXDIMM expands PIM technology from the chip to the module level by placing an AI engine on the DRAM module itself.

“Xilinx has been collaborating with Samsung Electronics to enable high-performance solutions for data center, networking, and real-time signal processing applications, starting with the Virtex UltraScale+ HBM family, and recently introduced our new and exciting Versal HBM series products,” said Arun Varadarajan Rajagopal, senior director, Product Planning at Xilinx, Inc.

“We are delighted to continue this collaboration with Samsung as we help to evaluate HBM-PIM systems for their potential to achieve major performance and energy-efficiency gains in AI applications.”

AXDIMM improves performance by installing an AI engine in each rank, the basic unit of operation of a DRAM module, and maximizing parallel processing.

The AI engines also enable computation to take place inside the DRAM module, reducing data transfer between the CPU and DRAM modules and increasing the energy efficiency of AI accelerator systems.
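As a loose analogy for the rank-level parallelism described above, and again with invented names rather than Samsung's actual interface, the workload can be pictured as being split across independent units that each compute their share at the same time:

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

# Hypothetical sketch: treat each "rank" as an independent worker that runs
# its slice of an inference-style matrix-vector product in parallel.
def rank_compute(weights_chunk, activations):
    return weights_chunk @ activations           # local computation within one rank

def axdimm_style_matvec(weights, activations, num_ranks=4):
    chunks = np.array_split(weights, num_ranks)  # one slice of the matrix per rank
    with ThreadPoolExecutor(max_workers=num_ranks) as pool:
        partials = pool.map(rank_compute, chunks, [activations] * num_ranks)
    return np.concatenate(list(partials))        # host stitches partial results together

W = np.random.rand(512, 256)
x = np.random.rand(256)
assert np.allclose(axdimm_style_matvec(W, x), W @ x)
```

In the sketch the result matches an ordinary matrix-vector product; the point is that each rank works only on its own slice, which is what allows the module to process in parallel without funneling all the data through the CPU.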

Samsung Electronics’ AXDIMM is currently having its performance evaluated in the server environments of global customers.

(With inputs from ANI)

Edited by Saptak Datta and Praveen Pramod Tewari



