
Replace HBM used in AI GPUs with 100x more performance

NEO Semiconductor has just announced its new 3D X-AI chip technology, which aims to replace the DRAM dies inside HBM and relieve data bus bottlenecks by performing AI processing directly in 3D DRAM.


The new 3D X-AI chip technology can reduce the enormous amount of data transferred between HBM and GPUs during AI workloads. NEO’s innovative new 3D X-AI chip technology is designed to “revolutionize the performance, power consumption and cost of AI chips for AI applications such as generative AI.”

NEO says its 3D X-AI chip technology delivers 100x the performance, using 8,000 neural circuits to perform AI processing inside the 3D memory; a 99% reduction in power consumption, by minimizing data transfers to the GPU and cutting the power and heat generated on the data bus; and 8x the storage density, with 300 memory layers allowing HBM to store larger AI models.

A single 3D X-AI chip has 300 layers of 3D DRAM cells with a capacity of 128GB, plus a layer of neural circuits with 8,000 neurons. NEO estimates the chip can support AI processing throughput of up to 10TB/s; with twelve 3D X-AI chips stacked in an HBM package, throughput reaches 120TB/s, which NEO claims amounts to a 100x performance increase.
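For a rough sense of how those headline figures combine, here is a minimal back-of-the-envelope sketch in Python. The per-chip numbers are the ones NEO quotes above; the variable names, the capacity scaling, and the baseline comparison are illustrative assumptions rather than figures from the announcement.

```python
# Back-of-the-envelope arithmetic for NEO's quoted 3D X-AI figures.
# Per-chip values come from the article; everything derived below is
# simple scaling, not a number NEO has published.

PER_CHIP_THROUGHPUT_TBPS = 10.0   # NEO's estimated AI processing throughput per chip
PER_CHIP_CAPACITY_GB = 128        # 300 DRAM layers per chip
CHIPS_PER_STACK = 12              # twelve 3D X-AI chips in one HBM-style package

stack_throughput_tbps = PER_CHIP_THROUGHPUT_TBPS * CHIPS_PER_STACK   # 120 TB/s
stack_capacity_gb = PER_CHIP_CAPACITY_GB * CHIPS_PER_STACK           # 1536 GB (inferred)

# The "100x" claim implies a baseline of roughly 1.2 TB/s per conventional
# HBM stack (assumption: in line with current HBM3E bandwidth per stack).
ASSUMED_BASELINE_TBPS = 1.2
speedup = stack_throughput_tbps / ASSUMED_BASELINE_TBPS

print(f"Per-stack throughput: {stack_throughput_tbps:.0f} TB/s")
print(f"Per-stack capacity (inferred): {stack_capacity_gb} GB")
print(f"Implied speedup vs. assumed baseline: {speedup:.0f}x")
```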

Andy Hsu, Founder and CEO of NEO Semiconductor, said: “Current AI chips waste significant amounts of performance and power due to architectural and technological inefficiencies. The current AI chip architecture stores data in HBM and relies on a GPU for all computations. This separate data storage and data processing architecture makes the data bus an unavoidable performance bottleneck. Transferring large amounts of data over the data bus results in limited performance and very high power consumption. 3D X-AI can perform AI processing in each HBM chip. This can dramatically reduce the data transferred between HBM and GPU to improve performance and dramatically reduce power consumption.”

Jay Kramer, President of Network Storage Advisors, said: “The application of 3D X-AI technology can accelerate the development of new AI use cases and promote the creation of new...”
