Microchip’s SuperFlash memBrain Neuromorphic Memory Solution Provides Substantial Reduction in Compute Power to Improve AI Inference at the Edge

As artificial intelligence (AI) processing moves from the cloud to the edge of the network, battery-powered and deeply embedded devices are challenged to perform AI functions such as computer vision and voice recognition. Microchip Technology, via its Silicon Storage Technology (SST) subsidiary, is addressing this challenge by significantly reducing power with its analog memory technology, the memBrain neuromorphic memory solution.

Based on its industry proven SuperFlash technology and optimized to perform vector matrix multiplication (VMM) for neural networks, Microchip’s analog flash memory solution improves system architecture implementation of VMM through an analog in-memory compute approach, enhancing AI inference at the edge.

Because current neural network models may require 50M or more synapses (weights) for processing, it becomes challenging to provide enough bandwidth to off-chip DRAM, creating a bottleneck for neural network computing and increasing overall compute power. In contrast, the memBrain solution stores synaptic weights in on-chip floating gate memory, offering significant improvements in system latency. When compared to traditional digital DSP and SRAM/DRAM based approaches, it delivers 10 to 20 times lower power and a significantly reduced overall BOM.
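The vector-matrix multiplication (VMM) kernel described above is the operation that dominates neural-network inference and drives the weight-fetch bandwidth problem. As an illustrative sketch (not Microchip code; all names here are hypothetical), the kernel in a conventional digital design looks like this, where every weight access inside the inner loop corresponds to a memory fetch that an analog in-memory approach avoids by computing directly where the weights are stored:

```python
def vmm(inputs, weights):
    """Multiply a length-N input vector by an NxM weight matrix -> length-M output.

    In a digital DSP + SRAM/DRAM design, every weights[i][j] access below is a
    memory fetch; an analog in-memory compute array instead performs the
    multiply-accumulate inside the memory cells holding the weights.
    """
    n = len(inputs)
    m = len(weights[0])
    outputs = [0.0] * m
    for j in range(m):
        acc = 0.0
        for i in range(n):
            acc += inputs[i] * weights[i][j]  # one weight fetch per MAC
        outputs[j] = acc
    return outputs

# Example: a 3-input, 2-output layer
x = [1.0, 0.5, -2.0]
W = [[0.2, -0.1],
     [0.4,  0.3],
     [0.1,  0.5]]
y = vmm(x, W)
```

At 50M weights, the inner loop implies tens of millions of such fetches per inference pass, which is the bandwidth bottleneck the on-chip approach sidesteps.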

“As technology providers for the automotive, industrial and consumer markets continue to implement VMM for neural networks, our architecture helps these forward-facing solutions realize power, cost and latency benefits,” said Mark Reiten, vice president of the license division at SST. “Microchip will continue to deliver highly reliable and versatile SuperFlash memory solutions for AI applications.”

The memBrain solution is being adopted by companies looking to advance machine learning capabilities in edge devices. Because it significantly reduces power consumption, this analog in-memory compute solution is well suited to a broad range of AI applications.

“Microchip’s memBrain solution enables ultra-low-power in-memory computation for our forthcoming analog neural network processors,” said Kurt Busch, CEO of Syntiant Corp. “Our partnership with Microchip continues to offer Syntiant many critical advantages as we support pervasive machine learning for always-on applications in voice, image and other sensor modalities in edge devices.”

SST will showcase this analog memory solution and present Microchip’s memBrain product tile array-based architecture at the AI/ML session track on flash performance scaling at the 2019 Flash Memory Summit from August 6-8, 2019, at the Santa Clara Convention Center in Santa Clara, California.

Electronics Media is an Indian electronics and tech journalism platform dedicated to the international electronics and tech industry. EM covers news from semiconductors, aerospace, defense, IoT, design, tech startups, emerging technology, innovation and business trends worldwide. Follow us on Twitter for the latest industry updates.