Can One Design a Series of Fast, Energy-Efficient “Brains” Based on Neuromorphic Computing to Solve Complex Pattern Recognition Problems?
Source: Current Chinese Computer Science, Volume 1, Issue 1, Apr 2021, pp. 2-5
Abstract
Background: In this paper, we present a theoretical discussion of neuromorphic computing circuit dynamics and their relation to AI deep learning neural networks. Hardware implementations of neuromorphic computing and of AI deep learning neural networks are discussed.
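The abstract does not reproduce the paper's equations. As a point of reference only, the dynamics of a single voltage-controlled memristive element are conventionally written in the generic Chua-Kang form sketched below; the temporal and spatial (Maxwell-type) formulation mentioned later in this abstract extends such element-level dynamics to the coupled circuit.

```latex
% Generic (Chua--Kang) memristive-system equations for a single
% voltage-controlled element: x is the internal device state, v the
% applied voltage, i the resulting current. This is the textbook form,
% not the paper's own spatio-temporal formulation.
\begin{align}
  i(t)       &= G\bigl(x(t), v(t)\bigr)\, v(t), \\
  \dot{x}(t) &= f\bigl(x(t), v(t)\bigr).
\end{align}
% The ideal charge-controlled memristor is the special case
% v(t) = M(q(t))\, i(t), with M(q) = d\varphi/dq and \dot{q}(t) = i(t).
```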
Objective: The investigation is motivated by the design of a feasible, fast, and energy-efficient circuit device, together with an efficient training computation method, for solving complex classification problems with AI neural networks.
Methods: We focus on solving pattern classification and recognition problems in real applications from both the logic-computation and the physical-circuit-device perspectives. FPGA approaches are considered, and a mapping from the logic level to the physical level is proposed.
Results: A pragmatic mapping method is derived, and an FPGA-based implementation is proposed.
Conclusion: We thus propose an approach to solving complex classification problems. First, neuromorphic computing is introduced as a new research area, covering physical circuit properties, the physical properties of memristive devices, and the circuit dynamics described by temporal and spatial (Maxwell) differential equations. Second, we show that by using AI deep learning to train the neural networks we are able to derive the optimal network weights. Last but not least, we outline a mapping method and show, in general, how the neuromorphic circuit works in practice after the weights of the AI deep learning neural networks are mapped onto the neuromorphic circuit synapses/memristors. We also discuss physical device feasibility and related matters. The method proposed in this paper is pragmatic and constructive.
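To make the train-then-map workflow outlined above concrete, the minimal sketch below trains a small classifier in software and transfers its signed weights onto a pair of non-negative memristor conductance columns, so that the analog crossbar current reproduces the software dot product. It is only an illustration under assumed values: the network size, the conductance range G_MIN/G_MAX, and the differential-pair encoding are assumptions, not the implementation described in the paper.

```python
# Sketch (not the authors' implementation) of the workflow in the conclusion:
# train a small network in software to obtain weights, then map those weights
# onto memristor conductances in a crossbar, where the analog matrix-vector
# product follows Ohm's and Kirchhoff's laws.
import numpy as np

rng = np.random.default_rng(0)

# --- 1. Train a tiny logistic-regression classifier on synthetic 2-class data.
n, d = 200, 4
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (X @ w_true > 0).astype(float)

w = np.zeros(d)
lr = 0.5
for _ in range(500):                       # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w)))     # sigmoid activation
    w -= lr * X.T @ (p - y) / n            # gradient of the cross-entropy loss

# --- 2. Map signed software weights onto non-negative device conductances.
G_MIN, G_MAX = 1e-6, 1e-4                  # assumed conductance range (siemens)
scale = (G_MAX - G_MIN) / np.abs(w).max()
G_pos = G_MIN + scale * np.clip(w, 0, None)    # "positive" column of the pair
G_neg = G_MIN + scale * np.clip(-w, 0, None)   # "negative" column of the pair

# --- 3. Analog inference: column currents i = G . v, output = i_pos - i_neg.
def crossbar_output(x_volts):
    i_pos = G_pos @ x_volts                # Kirchhoff current summation
    i_neg = G_neg @ x_volts
    return (i_pos - i_neg) / scale         # rescale back to weight units

# Check that the mapped crossbar reproduces the software classifier's logits.
software = X @ w
hardware = np.array([crossbar_output(x) for x in X])
print("max mismatch:", np.max(np.abs(software - hardware)))
```

The differential pair used here is one common way to represent signed weights with strictly non-negative device conductances; other encodings, such as a fixed reference column, would serve the same purpose.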