Memristive Devices for Brain-Inspired Computing: From Materials, Devices, and Circuits to Applications - Computational Memory, Deep Learning, and Spiking Neural Networks. Woodhead Publishing Series in Electronic and Optical Materials.
Editors: Sabina Spiga, Abu Sebastian, Damien Querlioz, Bipin Rajendran
Memristive Devices for Brain-Inspired Computing: From Materials, Devices, and Circuits to Applications - Computational Memory, Deep Learning, and Spiking Neural Networks reviews the latest in materials and device engineering for optimizing memristive devices beyond storage applications and toward brain-inspired computing. The book gives readers an understanding of four key concepts: materials and device aspects, with a view of current materials systems and their remaining barriers; algorithmic aspects, comprising basic concepts of neuroscience as well as various computing concepts; the circuits and architectures implementing those algorithms based on memristive technologies; and target applications, including brain-inspired computing, computational memory, and deep learning.
This comprehensive book is suitable for an interdisciplinary audience, including materials scientists, physicists, electrical engineers, and computer scientists.
Part I Memristive devices for brain-inspired computing
1. Role of resistive memory devices in brain-inspired computing
2. Resistive switching memories
3. Phase change memories
4. Magnetic and ferroelectric memories
5. Selectors for resistive memory devices

Part II Computational memory
6. Memristive devices as computational memory
7. Logical operations
8. Hyperdimensional computing nanosystem: in-memory computing using monolithic 3D integration of RRAM and CNFET
9. Matrix vector multiplications using memristive devices and applications thereof
10. Computing with device dynamics
11. Exploiting stochasticity for computing

Part III Deep learning
12. Memristive devices for deep learning applications
13. PCM-based co-processors for deep learning
14. RRAM-based co-processors for deep learning

Part IV Spiking neural networks
15. Memristive devices for spiking neural networks
16. Neuronal realizations based on memristive devices
17. Synaptic realizations based on memristive devices
18. Neuromorphic co-processors and experimental demonstrations
19. Recent theoretical developments and applications of spiking neural networks
IBM Research – Zurich, Switzerland
Abu Sebastian received a B.E. (Hons.) degree in Electrical and Electronics Engineering from BITS Pilani, India, in 1998, and M.S. and Ph.D. degrees in Electrical Engineering (with a minor in Mathematics) from Iowa State University in 1999 and 2004, respectively. Since 2006, he has been a Research Staff Member at IBM Research - Zurich in Rüschlikon, Switzerland.
He has contributed to several key projects in the field of storage and memory technologies, such as probe-based data storage, phase-change memory, and carbon-based memory. Most recently, he has been actively researching non-von Neumann computing, with the aim of connecting the technological elements with applications such as cognitive computing. He has published over 140 articles in journals and conference proceedings and holds over 30 granted patents. He is a co-recipient of the 2009 IEEE Control Systems Technology Award. In 2015 he was awarded a European Research Council (ERC) Consolidator Grant.
- Provides readers with an overview of four key concepts in this emerging research topic: materials and device aspects, algorithmic aspects, circuits and architectures, and target applications
- Covers a broad range of applications, including brain-inspired computing, computational memory, deep learning and spiking neural networks
- Includes perspectives from a wide range of disciplines, including materials science, electrical engineering and computing, providing a unique interdisciplinary look at the field
Publication date: 06-2020
564 pages
15 x 22.8 cm
Topics of Memristive Devices for Brain-Inspired Computing:
Keywords:
3D integration; Accumulative switching; Action potential; Algorithms; Backpropagation; Bayesian inference; Brain-inspired computing; CBRAM; Carbon nanotube (CNT); Carbon nanotube FET (CNFET); Chaos; Computing; Computing-in-memory; Conductive filament; Cross-point array; Crossbar array; Deep learning; Deep neural networks; Device physics; Dot-product; Edge of chaos; FTJ; FeFET; FeRAM; Ferroelectric; Ferromagnetic; Hardware implementation; Hardware neural networks; Hyperdimensional computing; IMT; In-memory computing; Interface switching; Logic-in-memory; MAGIC; MRAM; MTJ; Matrix multiplication; Memristive devices; Memristors; Metal–insulator transition; Multilevel; NP-complete; NP-hard; Nanosystems; Neural networks; Neuromorphic computing; Neurons; Non-von Neumann; Nonidealities; Nonlinear dynamics; Nonvolatile memory; Nonvolatile switching; OTS; PCM; Phase-change memory; Plasticity; Polarity inversion; Population coding; Resistive RAM (RRAM); Resistive memory; Resistive processing unit (RPU); Resistive switching memory; Selector device; Spiking neural networks; Spin-transfer torque; Spintronics; Stochastic computing; Stochastic resonance; Synapses; Synaptic plasticity rules; Three-factor rules; Threshold switching; Tunnel magnetoresistance (TMR); Volatile switching; Von Neumann bottleneck