Perspective

Opportunities for neuromorphic computing algorithms and applications

A Publisher Correction to this article was published on 11 March 2022, and the article has been updated.

Abstract

Neuromorphic computing technologies will be important for the future of computing, but much of the work in the field to date has focused on hardware development. Here, we review recent results in neuromorphic computing algorithms and applications. We highlight the characteristics of neuromorphic computing technologies that make them attractive for the future of computing, and we discuss opportunities for the future development of algorithms and applications on these systems.

Fig. 1: Comparison of the von Neumann architecture with the neuromorphic architecture.
Fig. 2: Common training approaches for spiking neural networks (SNNs); see the sketch below this list for the neuron dynamics these approaches operate on.
Fig. 3: Opportunity for full compute stack co-design in neuromorphic computers.
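For orientation, the following is a minimal sketch of the discrete-time leaky integrate-and-fire (LIF) dynamics that the SNN training approaches summarized in Fig. 2 typically operate on. It is an illustrative assumption rather than code from the article: the function lif_step and the parameter values beta (leak factor) and v_th (firing threshold) are hypothetical names and choices.

    # Minimal discrete-time LIF layer in plain NumPy (illustrative only).
    import numpy as np

    def lif_step(v, x, w, beta=0.9, v_th=1.0):
        # Leaky integration: decay the membrane potential, then add weighted input.
        v = beta * v + x @ w
        # Emit a binary spike wherever the membrane crosses the threshold.
        s = (v >= v_th).astype(float)
        # Soft reset: subtract the threshold from neurons that just spiked.
        v = v - s * v_th
        return v, s

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.5, size=(3, 2))  # three inputs feeding two LIF neurons
    v = np.zeros(2)
    for t in range(5):
        x = rng.random(3)                   # illustrative input activity at step t
        v, s = lif_step(v, x, w)
        print(t, s)

The thresholding step is non-differentiable, which is why gradient-based SNN training commonly substitutes a smooth surrogate for its derivative during the backward pass; other families of approaches, such as mapping trained artificial neural networks onto spiking substrates or applying local plasticity rules, avoid differentiating through the spike altogether.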

Acknowledgements

This material is based on work supported by the US Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, R. Pino, program manager, under contract number DE-AC05-00OR22725. We would like to thank N. Armistead for his aid in creating the graphics for this manuscript.

Author information

Contributions

C.D.S. led the preparation, writing and editing of this manuscript. S.R.K., M.P. and J.P.M. contributed to the neuromorphic hardware description. C.D.S., S.R.K. and M.P. contributed to the machine learning algorithms section, and S.R.K., P.D. and B.K. contributed to the non-machine-learning section. All authors contributed to the sections on closing the gap between expectations and reality and on the outlook.

Corresponding author

Correspondence to Catherine D. Schuman.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Computational Science thanks James Aimone, Giacomo Indiveri and Thomas Nowotny for their contribution to the peer review of this work. Handling editor: Fernando Chirigati, in collaboration with the Nature Computational Science team.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Schuman, C.D., Kulkarni, S.R., Parsa, M. et al. Opportunities for neuromorphic computing algorithms and applications. Nat Comput Sci 2, 10–19 (2022). https://doi.org/10.1038/s43588-021-00184-y
