The Landscape of Compute-near-memory and Compute-in-memory: A Research and Commercial Overview

24 Jan 2024 | ASIF ALI KHAN, JOÃO PAULO C. DE LIMA, HAMID FARZANEH, JERONIMO CASTRILLON
The paper surveys the landscape of compute-near-memory (CNM) and compute-in-memory (CIM) systems, emphasizing their potential to address the limitations of conventional computing in data-intensive applications, particularly machine learning. CNM and CIM aim to reduce data movement between memory and processing units, improving performance and energy efficiency while lowering cost. Where CNM places processing units near the memory, CIM performs computations directly within the memory devices themselves. These paradigms have gained traction due to the exponential growth in data volume and the need for more efficient computing solutions.

The paper reviews the technical foundations of CNM and CIM, including their underlying technologies, working principles, and commercial applications. It highlights the challenges associated with these systems, such as the lack of a mature software ecosystem and the need for specialized hardware and software support, and examines commercial trends in the in-memory computing market, noting its rapid growth and the increasing number of startups offering CIM and CNM solutions. Key technologies explored include SRAM, DRAM, PCM, RRAM, MRAM, and FeFETs, each with its own advantages and challenges.

The paper presents various CNM and CIM architectures, such as UPMEM, McDRAM, MViD, and Samsung's PIM-HBM and AxDIMM, which demonstrate the feasibility of these systems in real-world applications. It also discusses the challenges in programming and optimizing these systems, and the importance of software support for their effective deployment. The paper concludes that while CIM and CNM systems are still in their early stages, they hold significant promise for future computing, particularly in machine learning and other data-intensive domains.
The ongoing advancements in memory technologies and the growing demand for efficient computing solutions are expected to drive further innovation in this area.
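To make the CIM idea concrete, the sketch below models the operation most often mapped to resistive memories such as PCM and RRAM: an analog matrix-vector multiply in a crossbar array. The model, names, and values are illustrative assumptions, not taken from the paper. Weights are stored as cell conductances; driving each word line with a voltage produces per-cell currents by Ohm's law, and the currents on each bit line sum by Kirchhoff's current law, yielding the full dot product in one step without moving the weights out of the array.

```python
# Illustrative sketch (assumed model, not from the paper): an ideal
# resistive crossbar computing a matrix-vector product in memory.
# G[i][j] is the conductance of the cell at word line i, bit line j.
# Driving word line i with voltage V[i] injects a current G[i][j] * V[i]
# into bit line j (Ohm's law); the bit line sums these contributions
# (Kirchhoff's current law), so each output current is one dot product.

def crossbar_mvm(G, V):
    """Ideal analog matrix-vector multiply: returns bit-line currents."""
    n_rows, n_cols = len(G), len(G[0])
    return [sum(G[i][j] * V[i] for i in range(n_rows)) for j in range(n_cols)]

G = [[1.0, 0.5],
     [0.2, 0.8],
     [0.4, 0.1]]        # 3 word lines x 2 bit lines of cell conductances
V = [0.3, 0.6, 0.9]     # word-line voltages encoding the input vector

I = crossbar_mvm(G, V)  # bit-line currents encoding the output vector
print(I)
```

Real devices add non-idealities the survey's device chapters discuss (limited conductance levels, wire resistance, read disturb), so practical CIM designs pair the array with ADCs/DACs and calibration; this sketch captures only the ideal computation.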