CONV-SRAM: An Energy-Efficient SRAM With In-Memory Dot-Product Computation for Low-Power Convolutional Neural Networks

Authors: Biswas, Avishek; Chandrakasan, Anantha P.

Author Affiliations: Texas Instruments Inc., Kilby Labs, Dallas, TX 75243, USA; MIT, Dept. of Electrical Engineering & Computer Science, Cambridge, MA 02139, USA

Published in: IEEE JOURNAL OF SOLID-STATE CIRCUITS (IEEE J Solid State Circuits)

Year/Volume/Issue: 2019, Vol. 54, No. 1

Pages: 217-230

Subject Classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]

Funding: Intel Corporation

Keywords: Analog computing; binary weights; convolutional neural networks (CNNs); dot-product; edge-computing; energy-efficient SRAM; in-memory computation; machine learning (ML)

Abstract: This paper presents an energy-efficient static random access memory (SRAM) with embedded dot-product computation capability, for binary-weight convolutional neural networks. A 10T bit-cell-based SRAM array is used to store the 1-b filter weights. The array implements the dot-product as a weighted average of the bitline voltages, which are proportional to the digital input values. Local integrating analog-to-digital converters compute the digital convolution outputs corresponding to each filter. We have successfully demonstrated functionality (>98% accuracy) with the 10,000 test images in the MNIST hand-written digit recognition data set, using 6-b inputs/outputs. Compared to conventional full-digital implementations using small bit-widths, we achieve similar or better energy efficiency by reducing data transfer, due to the highly parallel in-memory analog computations.
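As an illustration of the computation described in the abstract, the short Python sketch below models the binary-weight dot-product as a weighted average of the input values, followed by a quantization step standing in for the local integrating ADC. It is a behavioral sketch only, not the paper's circuit: the filter length N, the function name conv_sram_dot_product, and the ideal (noise-free, mismatch-free) ADC are our own assumptions.

import numpy as np

# Behavioral sketch (assumed model, not the paper's implementation):
# 6-bit unsigned inputs, binary weights in {-1, +1} stored in the array,
# and an ideal quantizer in place of the local integrating ADC.

IN_BITS = 6      # input/output bit-width reported in the abstract
N = 64           # filter length (illustrative choice, not from the paper)

def conv_sram_dot_product(x, w):
    """Ideal model of one in-memory dot-product.

    x : (N,) integer inputs in [0, 2**IN_BITS - 1]; in hardware these set
        bitline voltages proportional to the digital input values.
    w : (N,) binary weights in {-1, +1}.
    Returns the digitized weighted average, mimicking the local ADC.
    """
    # "Analog" core: weighted average of the bitline values.
    avg = np.dot(w, x) / len(x)
    # Quantize: round and clamp the signed average to the +/- full-scale range.
    full_scale = 2**IN_BITS - 1
    return int(np.clip(np.round(avg), -full_scale, full_scale))

# Example: random 6-bit inputs against a random binary filter.
rng = np.random.default_rng(0)
x = rng.integers(0, 2**IN_BITS, size=N)
w = rng.choice([-1, 1], size=N)
print(conv_sram_dot_product(x, w))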
