Prof. Dmitri Strukov interviewed by Nature Communications about the challenges in developing neuromorphic computing
“Building Brain-Inspired Computing”
Electrical & Computer Engineering Professor Dmitri Strukov and other researchers talked to Nature Communications about the opportunities and challenges in developing brain-inspired computing technologies, namely neuromorphic computing, and advocated for effective collaboration across research disciplines to support the emerging community.
Could you tell us about your research background and how it brought you to work on neuromorphic computing?
Dmitri Strukov (DS): I was trained as an electrical engineer and became interested in developing circuits and architectures based on emerging electron devices during graduate school at Stony Brook University. Afterwards, I moved to Hewlett Packard Laboratories as a postdoctoral researcher and shifted my attention to device physics. I spent most of my time developing models for mixed electronic-ionic conductors that could be used to implement resistive switching devices (known nowadays as memristors). This experience naturally led me to choose neuromorphic computing, one of the most promising applications of memristors, as my research area after I joined the University of California, Santa Barbara. My major focus now is on developing practical mixed-signal circuits for artificial neural networks. This is a challenging topic because it spans a broad range of disciplines, from electron devices to algorithms. In the long term, I hope that our research will lead to practically useful neuromorphic systems that are used in everyday life.
Why do we need neuromorphic computing?
DS: The answer is quite obvious if one interprets neuromorphic computing as a biologically inspired computing technology facilitated by powerful deep learning algorithms, which have already shown a profound impact on science, technology, and our society. However, when considering the original definition of neuromorphic computing coined by Carver Mead at Caltech, which can be loosely put as “analog computing hardware organized similarly to the brain”, the answer becomes less clear to me. This is in part because such a definition still leaves some ambiguity about how closely neuromorphic computing hardware should emulate the brain and what functionalities are expected from such systems. One could call hardware neuromorphic if it merely borrows a few tricks from biology, such as perceptron-like distributed parallel information processing, to perform simple machine learning tasks. Or should it also integrate more advanced functions (e.g., spike-time encoding, various types of plasticity, homeostasis, etc.) and be capable of realizing higher-level cognitive functions? Nevertheless, the primary motivation for neuromorphic computing is arguably to achieve the extreme energy efficiency of the brain. Indeed, this would be the main advantage of analog and mixed-signal implementations of simple perceptron networks as well as of advanced spiking neural networks. Some existing demonstrations, albeit on simple tasks like image classification, have shown improvements of many orders of magnitude in energy and speed compared to purely digital computing, and some can even surpass the performance of the human brain.
What can we learn from the brain for information processing? How can we emulate the human brain using electronic devices, and where are we now?
DS: There is a general consensus on the usefulness of some tricks employed by the brain, such as analog and in-memory computing, massively parallel processing, spike coding, and task-specific connectivity in neural networks. Many of these ideas have already been implemented in state-of-the-art neuromorphic systems. I do believe, however, that we should not blindly try to mimic all features of the brain, at least not without a good engineering reason, and that we should consider simpler approaches based on more conventional technologies to achieve the same goal. On the other hand, we should also keep in mind that over millions of years the evolution of biological brains has been constrained by biomaterials optimized for specific tasks, while in neuromorphic engineering we now have a much wider range of material choices. Therefore, there could be profound differences in design rules. For example, the brain has to rely on the poor conductors offered by biomaterials, which has presumably shaped the principles of brain structure and operation in ways that are not necessarily applicable to neuromorphic computing based on highly conducting materials.
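The in-memory computing trick mentioned here is worth unpacking: in a memristor crossbar, a matrix-vector multiplication happens in a single analog step via Ohm's law and Kirchhoff's current law, rather than through sequential digital multiply-accumulates. Below is a minimal sketch of this idea; the conductance values, voltage ranges, and array sizes are illustrative assumptions, not figures from the interview.

```python
import numpy as np

# Minimal sketch of analog in-memory matrix-vector multiplication on a
# memristor crossbar (illustrative values, not from the interview).
# Weights are stored as device conductances G (siemens); inputs are applied
# as voltages V. Ohm's law gives per-device currents G[i, j] * V[j], and
# Kirchhoff's current law sums them along each output line, so the crossbar
# computes I = G @ V in one parallel analog step.

rng = np.random.default_rng(0)

n_out, n_in = 3, 4
G = rng.uniform(1e-6, 1e-4, size=(n_out, n_in))  # programmed conductances (S)
V = rng.uniform(0.0, 0.2, size=n_in)             # input voltages (V)

I = G @ V  # output currents (A), one per output line
print("output currents (A):", I)
```

Because the multiply and accumulate are physical processes rather than clocked digital operations, this is where the energy and speed advantages cited above come from.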
From your perspective, what are the major hurdles to realizing neuromorphic computing today?
DS: In my opinion, there are tough challenges at several levels. From a technology perspective, the foremost challenge is the various device non-idealities, such as the notorious device-to-device variations in current-voltage characteristics and the poor yield of memory devices, which are among the key components of neuromorphic circuits (I will elaborate on these issues in the answer to question 6). In addition to these technological hurdles, I reckon there may be substantial economic and confidence barriers to realizing such a highly innovative yet high-risk technology. Ultimately, to be successful, neuromorphic computing hardware will have to win the competition against conventional digital circuits, which are supported by existing infrastructure and years of enormous investment. Fortunately, this barrier does not appear to be as high as it was, say, 20 years ago, because of the slowdown of innovation in conventional CMOS technology (mainly in feature-size scaling), the very high development and production costs of sub-10-nm CMOS circuits, and the general trend toward more specialized computing hardware. Apart from hardware issues, progress on the algorithmic front is clearly not sufficient to cope with the explosive growth in demand for neuromorphic computing, especially for higher-level cognitive tasks. The lack of suitable algorithms, in turn, has imposed large uncertainty on the design of neuromorphic hardware.
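To make the device-to-device variation issue concrete, here is a small sketch, continuing the crossbar example above, of how a spread in programmed conductances perturbs the analog dot product. The lognormal variation model and all numbers are my assumptions for illustration, not data from the interview.

```python
import numpy as np

# Sketch of how device-to-device variation degrades an analog dot product
# (assumed lognormal conductance spread; all values are illustrative).

rng = np.random.default_rng(1)

G_target = rng.uniform(1e-6, 1e-4, size=(3, 4))  # intended conductances (S)
V = rng.uniform(0.0, 0.2, size=4)                # input voltages (V)
I_ideal = G_target @ V                           # error-free output currents

for sigma in (0.05, 0.1, 0.3):  # assumed relative spread levels
    # Each device's actual conductance deviates multiplicatively from target.
    G_actual = G_target * rng.lognormal(mean=0.0, sigma=sigma, size=G_target.shape)
    rel_err = np.linalg.norm(G_actual @ V - I_ideal) / np.linalg.norm(I_ideal)
    print(f"sigma = {sigma:.2f} -> relative output error = {rel_err:.1%}")
```

Larger networks can tolerate some of this error statistically, but as the spread grows the analog outputs drift far enough to flip classifications, which is why variation and yield top the list of technology hurdles.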
Additional Questions:
- What is your vision for tackling these major hurdles? Any suggestions?
- By what measure will we know when neuromorphic computing is ready to replace current digital computing?
- Any suggestions on how researchers, including but not limited to materials scientists, device physicists, circuit engineers, computer scientists, neuroscientists, and even policy makers, can better work together in this very multidisciplinary field?
Nature Communications – “Building Brain-Inspired Computing” (full article)