Volume 1, Issue 1
  • ISSN: 2665-9972
  • E-ISSN: 2665-9964

Abstract

In this paper, we present a theoretical discussion of neuromorphic computing circuit dynamics and its relation to AI deep learning neural networks. Hardware implementations of neuromorphic computing and of AI deep learning neural networks are discussed.

The investigation is motivated by the design of a feasible, fast, and energy-efficient circuit device, as well as an efficient training method, for solving complex classification problems with AI neural networks.

We focus on solving pattern classification and recognition problems in real applications from both the logic computation and physical circuit device perspectives. FPGA approaches are considered, and a mapping from the logic level to the physical level is proposed.

A pragmatic mapping method is derived, and an FPGA-based implementation is proposed.

Thus, we propose in this paper an approach to solving complex classification problems. First, neuromorphic computing is introduced as a new research area, covering physical circuit properties, memristive device physics, and the circuit dynamics described by temporal and spatial (Maxwell) differential equations. Second, we show that by training AI deep learning neural networks we are able to derive the optimal network weights. Finally, we outline a mapping method and show, in general, how the neuromorphic circuit works in practice after the weights of the AI deep learning neural network are mapped onto the neuromorphic circuit synapses/memristors, as sketched in the example below. We also discuss the feasibility of the physical devices and related matters. The method proposed in this paper is pragmatic and constructive.
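To illustrate the weight-mapping step described above, the following is a minimal sketch, assuming a linear scaling of trained weights into a hypothetical memristor conductance window and a differential (two-device-per-synapse) encoding. The conductance range, function names, and layer sizes are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's method): map trained
# neural-network weights onto memristor conductances for a crossbar-style
# neuromorphic circuit. The conductance window (G_MIN, G_MAX) and the
# differential-pair scheme are assumptions made for this example.
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4   # assumed device conductance range in siemens

def weights_to_conductances(W):
    """Map a real-valued weight matrix W to a pair of conductance matrices.

    Positive and negative weights are carried by two memristors per synapse
    (G_plus, G_minus), so the effective synaptic weight is proportional to
    G_plus - G_minus.
    """
    w_max = np.max(np.abs(W)) or 1.0          # avoid division by zero
    scale = (G_MAX - G_MIN) / w_max           # linear scaling into the window
    G_plus = G_MIN + scale * np.clip(W, 0, None)
    G_minus = G_MIN + scale * np.clip(-W, 0, None)
    return G_plus, G_minus

def crossbar_forward(V, G_plus, G_minus):
    """Output currents of one crossbar layer: I = (G_plus - G_minus)^T V."""
    return (G_plus - G_minus).T @ V

# Example: weights of an assumed trained 4-input, 3-output layer.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(4, 3))
Gp, Gm = weights_to_conductances(W)
I = crossbar_forward(np.array([0.1, 0.2, 0.0, 0.3]), Gp, Gm)
print(I)
```

In this scheme, each column of the crossbar sums the currents of its memristors (Kirchhoff's current law), so the analog hardware performs the matrix-vector product of a trained layer directly, which is the source of the speed and energy advantages discussed in the abstract.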

