Spiking Neural Networks: The Future of Brain-Inspired Computing
© 2025 by IJETT Journal
Volume-73 Issue-10
Year of Publication : 2025
Author : Sales G. Aribe Jr
DOI : 10.14445/22315381/IJETT-V73I10P104
How to Cite?
Sales G. Aribe Jr.,"Spiking Neural Networks: The Future of Brain-Inspired Computing", International Journal of Engineering Trends and Technology, vol. 73, no. 10, pp.32-48, 2025. Crossref, https://doi.org/10.14445/22315381/IJETT-V73I10P104
Abstract
Spiking Neural Networks (SNNs) represent the latest generation of neural computation, offering a brain-inspired alternative to conventional Artificial Neural Networks (ANNs). Unlike ANNs, which depend on continuous-valued signals, SNNs operate on discrete spike events, making them inherently more energy-efficient and temporally dynamic. This study presents a comprehensive analysis of SNN design models, training algorithms, and multi-dimensional performance metrics, including accuracy, energy consumption, latency, spike count, and convergence behavior. Key neuron models, such as the Leaky Integrate-and-Fire (LIF) model, and training strategies, including surrogate gradient descent, ANN-to-SNN conversion, and Spike-Timing-Dependent Plasticity (STDP), are examined in depth. Results show that surrogate-gradient-trained SNNs closely approximate ANN accuracy (within 1–2%), with faster convergence by the 20th epoch and latency as low as 10 milliseconds. Converted SNNs also achieve competitive performance but require higher spike counts and longer simulation windows. STDP-based SNNs, though slower to converge, exhibit the lowest spike counts and energy consumption (as low as 5 millijoules per inference), making them optimal for unsupervised and low-power tasks. These findings reinforce the suitability of SNNs for energy-constrained, latency-sensitive, and adaptive applications such as robotics, neuromorphic vision, and edge AI systems. While promising, challenges persist in hardware standardization and scalable training. This study concludes that SNNs, with further refinement, are poised to propel the next phase of neuromorphic computing.
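To make the abstract's key mechanisms concrete, the sketch below illustrates the three techniques it names: LIF membrane dynamics, a surrogate gradient for the non-differentiable spike function, and a pair-based STDP weight update. This is a minimal, self-contained sketch, not the paper's implementation; all constants (time constant, threshold, input current, learning rates) and the fast-sigmoid surrogate shape are assumed illustrative values, not parameters reported in the study.

```python
import math

# Illustrative constants; assumed values for demonstration, not parameters
# reported in the study.
TAU_M = 20.0     # membrane time constant (ms)
V_THRESH = 1.0   # firing threshold
V_RESET = 0.0    # reset potential
DT = 1.0         # simulation timestep (ms)

def lif_step(v, i_in):
    """One Euler step of the Leaky Integrate-and-Fire dynamics:
    dv/dt = (-(v - V_RESET) + i_in) / TAU_M, with reset on threshold crossing."""
    v = v + DT * (-(v - V_RESET) + i_in) / TAU_M
    spiked = v >= V_THRESH
    return (V_RESET if spiked else v), spiked

def surrogate_grad(v, beta=10.0):
    """Fast-sigmoid surrogate for the derivative of the spike function,
    which is zero almost everywhere and would otherwise block backpropagation."""
    return 1.0 / (beta * abs(v - V_THRESH) + 1.0) ** 2

def stdp_update(w, dt_post_pre, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic spike (dt_post_pre > 0), depress otherwise."""
    if dt_post_pre > 0:
        return w + a_plus * math.exp(-dt_post_pre / tau)
    return w - a_minus * math.exp(dt_post_pre / tau)

# Drive one neuron with a constant input current and count its spikes.
v, n_spikes = 0.0, 0
for _ in range(100):                     # 100 ms simulation window
    v, spiked = lif_step(v, 1.5)
    n_spikes += int(spiked)

print(f"spikes in 100 ms: {n_spikes}")                            # ~4 here
print(f"surrogate gradient at v=0.9: {surrogate_grad(0.9):.3f}")  # 0.250
print(f"weight after +5 ms pairing: {stdp_update(0.5, 5.0):.4f}") # ~0.5078
```

Running the sketch prints the spike count over a 100 ms window, the surrogate gradient near threshold, and the slightly potentiated weight after a single pre-before-post pairing, which together mirror the spike-count, gradient-training, and STDP notions the abstract evaluates.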
Keywords
Artificial Intelligence, Brain-inspired computing, Energy efficiency, Neuromorphic computing, Spiking Neural Network.
References
[1] Kaushik Roy, Akhilesh Jaiswal, and Priyadarshini Panda, “Towards Spike-Based Machine Intelligence with Neuromorphic Computing,” Nature, vol. 575, no. 7784, pp. 607-617, 2019.
[2] Wolfgang Maass, “Networks of Spiking Neurons: The Third Generation of Neural Network Models,” Neural Networks, vol. 10, no. 9, pp. 1659-1671, 1997.
[3] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton, “Deep Learning,” Nature, vol. 521, no. 7553, pp. 436-444, 2015.
[4] Jolitte A. Villaruz, Bobby D. Gerardo, and Ruji P. Medina, “Philippine Stock Exchange Index Forecasting Using a Tuned Artificial Neural Network Model with a Modified Firefly Algorithm,” 2023 IEEE 6th International Conference on Pattern Recognition and Artificial Intelligence (PRAI), Haikou, China, pp. 1039-1044, 2023.
[5] Mary Joy D. Viñas et al., “COVID-19 Outbreaks Effect on Air Quality Index: Evidence from Enhanced Artificial Neural Network,” 2023 8th International Conference on Computer and Communication Systems (ICCCS), Guangzhou, China, pp. 1117-1124, 2023.
[6] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton, “ImageNet Classification with Deep Convolutional Neural Networks,” Communications of the ACM, vol. 60, no. 6, pp. 84-90, 2017.
[7] Stephen Grossberg, “Recurrent Neural Networks,” Scholarpedia, vol. 8, no. 2, 2013.
[8] Sepp Hochreiter, and Jürgen Schmidhuber, “Long Short-Term Memory,” Neural Computation, vol. 9, no. 8, pp. 1735-1780, 1997.
[9] Rahul Dey, and Fathi M. Salem, “Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks,” 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), Boston, MA, USA, pp. 1597-1600, 2017.
[10] Ashish Vaswani et al., “Attention Is All You Need,” Advances in Neural Information Processing Systems, vol. 30, 2017.
[11] Wulfram Gerstner, and Werner M. Kistler, Spiking Neuron Models: Single Neurons, Populations, Plasticity, 1st ed., Cambridge University Press, 2002.
[12] Mike Davies et al., “Loihi: A Neuromorphic Manycore Processor with On-Chip Learning,” IEEE Micro, vol. 38, no. 1, pp. 82-99, 2018.
[13] Resmi Cherian, and E. Grace Mary Kanaga, “Unleashing the Potential of Spiking Neural Networks for Epileptic Seizure Detection: A Comprehensive Review,” Neurocomputing, vol. 598, 2024.
[14] Samanwoy Ghosh-Dastidar, and Hojjat Adeli, “Third Generation Neural Networks: Spiking Neural Networks,” Advances in Intelligent and Soft Computing, Berlin, Heidelberg, pp. 167-178, 2009.
[15] Gabriele Lagani et al., “Spiking Neural Networks and Bio-Inspired Supervised Deep Learning: A Survey,” arXiv Preprint, pp. 1-31, 2023.
[16] Giacomo Indiveri, and Timothy K. Horiuchi, “Frontiers in Neuromorphic Engineering,” Frontiers in Neuroscience, vol. 5, pp. 1-2, 2011.
[17] Sumit Soman, Jayadeva, and Manan Suri, “Recent Trends in Neuromorphic Engineering,” Big Data Analytics, vol. 1, no. 1, pp. 1-16, 2016.
[18] Paul A. Merolla et al., “A Million Spiking-Neuron Integrated Circuit with a Scalable Communication Network and Interface,” Science, vol. 345, no. 6197, pp. 668-673, 2014.
[19] Md Bokhtiar Al Zami et al., “Digital Twin in Industries: A Comprehensive Survey,” IEEE Access, vol. 13, pp. 47291-47336, 2025.
[20] Peter U. Diehl et al., “Fast-Classifying, High-Accuracy Spiking Deep Networks through Weight and Threshold Balancing,” 2015 International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland, pp. 1-8, 2015.
[21] Bodo Rueckauer et al., “Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification,” Frontiers in Neuroscience, vol. 11, pp. 1-12, 2017.
[22] Emre O. Neftci, Hesham Mostafa, and Friedemann Zenke, “Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks,” IEEE Signal Processing Magazine, vol. 36, no. 6, pp. 51-63, 2019.
[23] E.M. Izhikevich, “Simple Model of Spiking Neurons,” IEEE Transactions on Neural Networks, vol. 14, no. 6, pp. 1569-1572, 2003.
[24] Yuchen Wang et al., “A Universal ANN-to-SNN Framework for Achieving High Accuracy and Low Latency Deep Spiking Neural Networks,” Neural Networks, vol. 174, pp. 1-14, 2024.
[25] Benjamin Cramer et al., “The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks,” IEEE Transactions on Neural Networks and Learning Systems, vol. 33, no. 7, pp. 2744-2757, 2022.
[26] Khadeer Ahmed, Brain-Inspired Spiking Neural Networks, Biomimetics, IntechOpen, 2021.
[27] A.L. Hodgkin, and A.F. Huxley, “A Quantitative Description of Membrane Current and its Application to Conduction and Excitation in Nerve,” Bulletin of Mathematical Biology, vol. 52, no. 1-2, pp. 25-71, 1990.
[28] Xingting Yao et al., “GLIF: A Unified Gated Leaky Integrate-and-Fire Neuron for Spiking Neural Networks,” Advances in Neural Information Processing Systems, vol. 35, pp. 32160-32171, 2022.
[29] Walter Senn, and Jean-Pascal Pfister, “Spike-Timing Dependent Plasticity, Learning Rules,” Encyclopedia of Computational Neuroscience, New York, Springer, pp. 2824-2832, 2015.
[30] Guo-qiang Bi, and Mu-ming Poo, “Synaptic Modifications in Cultured Hippocampal Neurons: Dependence on Spike Timing, Synaptic Strength, and Postsynaptic Cell Type,” Journal of Neuroscience, vol. 18, no. 24, pp. 10464-10472, 1998.
[31] Sung Soo Park, and Young-Seok Choi, “Spiking Neural Networks for Physiological and Speech Signals: A Review,” Biomedical Engineering Letters, vol. 14, no. 5, pp. 943-954, 2024.
[32] Simon Thorpe, and Jacques Gautrais, “Rank Order Coding,” Computational Neuroscience, Boston, MA, pp. 113-118, 1998.
[33] Sanaullah et al., “Exploring Spiking Neural Networks: A Comprehensive Analysis of Mathematical Models and Applications,” Frontiers in Computational Neuroscience, vol. 17, pp. 1-20, 2023.
[34] Friedemann Zenke, and Surya Ganguli, “SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks,” Neural Computation, vol. 30, no. 6, pp. 1514-1541, 2018.
[35] Siddharth Sharma, Simone Sharma, and Anidhya Athaiya, “Activation Functions in Neural Networks,” International Journal of Engineering Applied Sciences and Technology, vol. 4, no. 12, pp. 310-316, 2020.
[36] Tomasz Szandała, “Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks,” Studies in Computational Intelligence, Singapore: Springer Singapore, pp. 203-224, 2021.
[37] Sen Lu, and Abhronil Sengupta, “Neuroevolution Guided Hybrid Spiking Neural Network Training,” Frontiers in Neuroscience, vol. 16, pp. 1-11, 2022.
[38] Michael Pfeiffer, and Thomas Pfeil, “Deep Learning with Spiking Neurons: Opportunities and Challenges,” Frontiers in Neuroscience, vol. 12, 2018.
[39] Youngeun Kim et al., “Exploring Temporal Information Dynamics in Spiking Neural Networks,” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 7, pp. 8308-8316, 2023.
[40] Amirhossein Tavanaei et al., “Deep Learning in Spiking Neural Networks,” Neural Networks, vol. 111, pp. 47-63, 2019.
[41] Jayram Moorkanikara Nageswaran et al., “A Configurable Simulation Environment for the Efficient Simulation of Large-Scale Spiking Neural Networks on Graphics Processors,” Neural Networks, vol. 22, no. 5-6, pp. 791-800, 2009.
[42] Hsin-Pai Cheng et al., “Understanding the Design of IBM Neurosynaptic System and Its Tradeoffs: A User Perspective,” Design, Automation & Test in Europe Conference & Exhibition (DATE), Lausanne, Switzerland, pp. 139-144, 2017.
[43] Chit-Kwan Lin et al., “Programming Spiking Neural Networks on Intel’s Loihi,” Computer, vol. 51, no. 3, pp. 52-61, 2018.
[44] Muhammad Aitsam, Sergio Davies, and Alessandro Di Nuovo, “Neuromorphic Computing for Interactive Robotics: A Systematic Review,” IEEE Access, vol. 10, pp. 122261-122279, 2022.
[45] Eunsu Kim, and Youngmin Kim, “Exploring the Potential of Spiking Neural Networks in Biomedical Applications: Advantages, Limitations, and Future Perspectives,” Biomedical Engineering Letters, vol. 14, no. 5, pp. 967-980, 2024.
[46] V. Rajakumari, and K.P. Pradhan, “Demonstration of an UltraLow Energy PD-SOI FinFET Based LIF Neuron for SNN,” IEEE Transactions on Nanotechnology, vol. 21, pp. 434-441, 2022.
[47] Saeed Reza Kheradpisheh et al., “STDP-based Spiking Deep Convolutional Neural Networks for Object Recognition,” Neural Networks, vol. 99, pp. 56-67, 2018.
[48] Wenzhe Guo et al., “Towards Efficient Neuromorphic Hardware: Unsupervised Adaptive Neuron Pruning,” Electronics, vol. 9, no. 7, pp. 1-15, 2020.
[49] Marc-Oliver Gewaltig, and Markus Diesmann, “Nest (Neural Simulation Tool),” Scholarpedia, vol. 2, no. 4, 2007.
[50] Hananel Hazan et al., “BindsNET: A Machine Learning-Oriented Spiking Neural Networks Library in Python,” Frontiers in Neuroinformatics, vol. 12, pp. 1-18, 2018.
[51] Marcel Stimberg, Romain Brette, and Dan F.M. Goodman, “Brian 2, An Intuitive and Efficient Neural Simulator,” eLife, vol. 8, pp. 11-41, 2019.
[52] Jesper Sjöström, and Wulfram Gerstner, “Spike-Timing Dependent Plasticity,” Scholarpedia, vol. 5, no. 2, 2010.
[53] Dominique Debanne, and Mu-Ming Poo, “Spike-Timing Dependent Plasticity Beyond Synapse - Pre- and Post-Synaptic Plasticity of Intrinsic Neuronal Excitability,” Frontiers in Synaptic Neuroscience, vol. 2, pp. 1-6, 2010.
[54] Timothée Masquelier, Rudy Guyonneau, and Simon J. Thorpe, “Competitive STDP-Based Spike Pattern Learning,” Neural Computation, vol. 21, no. 5, pp. 1259-1276, 2009.
[55] Slawomir Koziel, David Echeverría Ciaurri, and Leifur Leifsson, “Surrogate-Based Methods,” Studies in Computational Intelligence, Berlin, Heidelberg, pp. 33-59, 2011.
[56] Tehreem Syed et al., “Exploring Optimized Spiking Neural Network Architectures for Classification Tasks on Embedded Platforms,” Sensors, vol. 21, no. 9, pp. 1-25, 2021.
[57] Shivam S. Kadam, Amol C. Adamuthe, and Ashwini Patil, “CNN Model for Image Classification on MNIST and Fashion-MNIST Dataset,” Journal of Scientific Research, vol. 64, no. 2, pp. 374-384, 2020.
[58] Yu Hu et al., “Hand Gesture Recognition System using the Dynamic Vision Sensor,” 2022 5th International Conference on Circuits, Systems and Simulation (ICCSS), Nanjing, China, pp. 102-110, 2022.
[59] Marc-Oliver Gewaltig, and Markus Diesmann, “Nest (Neural Simulation Tool),” Scholarpedia, vol. 2, no. 4, 2007.
[60] Lars Niedermeier et al., “CARLsim 6: An Open Source Library for Large-Scale, Biologically Detailed Spiking Neural Network Simulation,” 2022 International Joint Conference on Neural Networks (IJCNN), Padua, Italy, pp. 1-10, 2022.
[61] Ting-Shuo Chou et al., “CARLsim 4: An Open Source Library for Large Scale, Biologically Detailed Spiking Neural Network Simulation using Heterogeneous Clusters,” 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, pp. 1-8, 2018.
[62] Michael Beyeler et al., “CARLsim 3: A User-Friendly and Highly Optimized Library for the Creation of Neurobiologically Detailed Spiking Neural Networks,” 2015 International Joint Conference on Neural Networks (IJCNN), Killarney, Ireland, pp. 1-8, 2015.
[63] Abderazek Ben Abdallah, and Khanh N. Dang, “Comprehensive Review of Neuromorphic Systems,” Neuromorphic Computing Principles and Organization, Springer Nature Switzerland, pp. 275-303, 2025.
[64] Hariprasad Kannan, Nikos Komodakis, and Nikos Paragios, “Newton-Type Methods for Inference in Higher-Order Markov Random Fields,” 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, pp. 7224-7233, 2017.
[65] Wei Bai et al., “Network Analysis of Anxiety and Depressive Symptoms Among Nursing Students during the COVID-19 Pandemic,” Journal of Affective Disorders, vol. 294, pp. 753-760, 2021.
[66] Arnon Amir et al., “A Low Power, Fully Event-Based Gesture Recognition System,” 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, pp. 7388-7397, 2017.
[67] Jun Haeng Lee, Tobi Delbruck, and Michael Pfeiffer, “Training Deep Spiking Neural Networks Using Backpropagation,” Frontiers in Neuroscience, vol. 10, pp. 1-13, 2016.
[68] Sanaullah et al., “Evaluation of Spiking Neural Nets-Based Image Classification Using the Runtime Simulator RAVSim,” International Journal of Neural Systems, vol. 33, no. 9, pp. 1-19, 2023.
[69] Yunpeng Huang et al., “Advancing Transformer Architecture in Long-Context Large Language Models: A Comprehensive Survey,” arXiv Preprint, pp. 1-40, 2023.
[70] Rasoul Hosseinzadeh, and Mahdi Sadeghzadeh, “Attention Mechanisms in Transformers: A General Survey,” Journal of Artificial Intelligence & Data Mining (JAIDM), vol. 13, no. 3, pp. 359-368, 2025.
[71] Sayed Mahbub Hasan Amiri et al., “The Carbon Cost of Conversation, Sustainability in the Age of Language Models,” arXiv Preprint, pp. 1-22, 2025.
[72] David Patterson et al., “Carbon Emissions and Large Neural Network Training,” arXiv Preprint, pp. 1-22, 2021.
[73] Zhanglu Yan, Zhenyu Bai, and Weng-Fai Wong, “Reconsidering the Energy Efficiency of Spiking Neural Networks,” arXiv Preprint, pp. 1-11, 2024.
[74] Sayeed Shafayet Chowdhury, Nitin Rathi, and Kaushik Roy, “One Timestep is All You Need: Training Spiking Neural Networks with Ultra Low Latency,” arXiv Preprint, pp. 1-17, 2021.
[75] Giacomo Indiveri, and Shih-Chii Liu, “Memory and Information Processing in Neuromorphic Systems,” Proceedings of the IEEE, vol. 103, no. 8, pp. 1379-1397, 2015.
