Deep learning has made significant progress over the past few years across a plethora of cognitive applications. Despite these advances, however, an often underestimated question is how much computational and energy resources such platforms require when running on hardware. It is clear today that orders-of-magnitude differences exist between the computational requirements of deep learning frameworks and those of the biological brain, the inspiration behind such algorithms. We strive to reduce this efficiency gap by decoding the functionality of the biological brain and incorporating such insights into the underlying computational primitives, learning methodologies and hardware substrate. We believe that exploring this bio-plausibility route can lead us to efficient machine intelligence.
We are actively working on neuromorphic hardware design based on emerging technologies. With future neuromorphic platforms expected to feature almost 10,000 synapses per neuron, nanoelectronic devices that mimic synaptic and neural functionalities at low terminal operating voltages have become imperative. To that end, we are investigating ferromagnetic and ferroelectric devices as a potential pathway to neuromorphic hardware, owing to their inherent advantages of low-voltage operation, enhanced reliability and unlimited endurance.
In addition to research on the hardware front, we are also investigating opportunities in the domain of neuromorphic sensors and algorithms. Neuromorphic hardware needs to be interfaced with event-driven sensors that naturally fit its event-based computing schemes. Simultaneously, appropriate learning methodologies driven by sparse neural events need to be explored for such “brain-like hardware” to be useful for machine learning tasks.
A few selected research projects are highlighted below:
Neuromorphic Algorithms
Neuromorphic computing platforms rely on computational primitives in which the “artificial” neurons and synapses mimic their biological counterparts with a much greater degree of bio-fidelity. Termed “Spiking Neural Networks” (SNNs), these models represent a significant shift from standard deep learning frameworks since they process information temporally through binary spike-based events. While the power-consumption benefits of neuromorphic computing platforms are apparent due to sparse, event-driven computation, scaling such computing schemes to large-scale machine learning tasks has been challenging. We are exploring algorithmic techniques to generate SNNs with deep architectures that achieve state-of-the-art recognition results on complex machine learning tasks like natural language processing. Beyond spiking neural network algorithms, our research extends across a large spectrum of bio-plausible algorithms, including dynamic networks, equilibrium-based learning dynamics and probabilistic AI, to enable robust and accurate ML systems capable of lifelong learning. Check out our recent research directions in this domain in PSU News!
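As a minimal illustration of the spike-based temporal processing described above, the sketch below implements a discrete-time leaky integrate-and-fire (LIF) neuron, the basic computational primitive of most SNNs. All names and parameter values here are illustrative choices of ours, not taken from any specific publication.

```python
def lif_neuron(input_current, v_thresh=1.0, leak=0.9, v_reset=0.0):
    """Leaky integrate-and-fire neuron: accumulate a leaky membrane
    potential over discrete time steps and emit a binary spike (1)
    whenever the threshold is crossed, then reset."""
    v = 0.0
    spikes = []
    for i_t in input_current:
        v = leak * v + i_t        # leaky integration of input current
        if v >= v_thresh:         # threshold crossing -> binary spike
            spikes.append(1)
            v = v_reset           # reset membrane potential
        else:
            spikes.append(0)
    return spikes

spike_train = lif_neuron([0.6, 0.6, 0.0, 0.8, 0.9])
```

Note that the output is purely binary and sparse in time: downstream computation is triggered only by the 1s in the spike train, which is the source of the event-driven efficiency discussed above.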
Relevant Publications:
[1] Malyaban Bal, Abhronil Sengupta, “SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation”, AAAI Conference on Artificial Intelligence (AAAI), 2024 (Acceptance rate ~23%). [Code]
[2] Sen Lu, Abhronil Sengupta, “Deep Unsupervised Learning Using Spike-Timing-Dependent Plasticity”, Neuromorphic Computing and Engineering, Vol. 4, Iss. 2, pp. 024004, 2024.
[3] Jiaqi Lin, Malyaban Bal, Abhronil Sengupta, “Scaling SNNs Trained Using Equilibrium Propagation to Convolutional Architectures”, ACM International Conference on Neuromorphic Systems (ICONS), 2024.
[4] Malyaban Bal, Yi Jiang, Abhronil Sengupta, “Exploring Extreme Quantization in Spiking Language Models”, ACM International Conference on Neuromorphic Systems (ICONS), 2024.
[5] Yi Jiang, Sen Lu, Abhronil Sengupta, “Stochastic Spiking Neural Networks with First-to-Spike Coding”, ACM International Conference on Neuromorphic Systems (ICONS), 2024.
[6] Malyaban Bal, Abhronil Sengupta, “Equilibrium-Based Learning Dynamics in Spiking Architectures”, IEEE International Symposium on Circuits and Systems (ISCAS), 2024 (Invited Special Session Paper).
[7] Malyaban Bal, Abhronil Sengupta, “Sequence Learning using Equilibrium Propagation”, International Joint Conference on Artificial Intelligence (IJCAI), 2023 (Oral presentation, Acceptance rate ~15%). [Code]
[8] Sen Lu, Abhronil Sengupta, “Neuroevolution Guided Hybrid Spiking Neural Network Training”, Frontiers in Neuroscience, Vol. 16, No. 838523, 2022.
[9] Sen Lu, Abhronil Sengupta, “Hybrid Neuromorphic Systems: An Algorithm-Application-Hardware-Neuroscience Co-Design Perspective”, IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2022.
[10] Hai-Tian Zhang*, Tae Joon Park*, A. N. M. Nafiul Islam*, Dat S. J. Tran*, Sukriti Manna*, Qi Wang*, Sandip Mondal, Haoming Yu, Suvo Banik, Shaobo Cheng, Hua Zhou, Sampath Gamage, Sayantan Mahapatra, Yimei Zhu, Yohannes Abate, Nan Jiang, Subramanian K. R. S. Sankaranarayanan, Abhronil Sengupta, Christof Teuscher, Shriram Ramanathan, “Reconfigurable perovskite nickelate electronics for artificial intelligence”, Science, Vol. 375, Iss. 6580, pp. 533-539, 2022 (Featured in PSU News, Featured in News & Views of Nature Materials, Featured in IEEE Spectrum, * denotes equal first author contribution).
[11] Sonali Singh, Anup Sarma, Sen Lu, Abhronil Sengupta, Mahmut T. Kandemir, Emre Neftci, Vijaykrishnan Narayanan, Chita R. Das, “Skipper: Enabling efficient SNN training through activation-checkpointing and time-skipping”, IEEE/ACM International Symposium on Microarchitecture (MICRO), 2022.
[12] Sen Lu, Abhronil Sengupta, “Exploring the Connection Between Binary and Spiking Neural Networks”, Frontiers in Neuroscience, Vol. 14, No. 535, 2020. [Code]
Astromorphic Engineering
Current neuromorphic computing architectures have focused mainly on emulating bio-plausible computational models of the neuron and synapse, but have largely overlooked other computational units of the biological brain that might contribute to cognition. Over the past few years, there has been increasing evidence that glial cells, and in particular astrocytes, play an important role in a multitude of brain functions. We are drawing inspiration and insights from computational neuroscience regarding the functionalities of glial cells to explore their role in the fault-tolerance and temporal-information-binding capacities of neuromorphic systems. The project was featured recently in PSU News.
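To make the fault-tolerance idea concrete, here is a deliberately simplified toy model of astrocyte-inspired self-repair: when some synapses fault (stuck at zero), a global astrocyte-like feedback signal upscales the surviving weights so that the neuron's total input drive is preserved. This is our own illustrative sketch, not the exact repair scheme of any publication listed below.

```python
def astrocytic_self_repair(weights, faulty):
    """Toy astrocyte-inspired self-repair: zero out faulted synapses
    and rescale the healthy ones so the summed synaptic drive onto
    the neuron is unchanged."""
    total = sum(weights)
    healthy = [w for i, w in enumerate(weights) if i not in faulty]
    if not healthy or sum(healthy) == 0:
        return [0.0] * len(weights)     # nothing left to compensate with
    scale = total / sum(healthy)        # global feedback gain
    return [0.0 if i in faulty else w * scale
            for i, w in enumerate(weights)]
```

For example, if the third of three synapses faults, the two survivors are boosted so the summed weight seen by the neuron stays the same, mimicking the re-normalizing influence astrocytes are believed to exert on synaptic transmission.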
Relevant Publications:
[1] Zhuangyu Han, A. N. M. Nafiul Islam, Abhronil Sengupta, “Astromorphic Self-Repair of Neuromorphic Hardware Systems”, AAAI Conference on Artificial Intelligence (AAAI), 2023 (Oral presentation, Acceptance rate ~19%). [Code]
[2] Umang Garg, Kezhou Yang, Abhronil Sengupta, “Emulation of Astrocyte Induced Neural Phase Synchrony in Spin-Orbit Torque Oscillator Neurons”, Frontiers in Neuroscience, Vol. 15, No. 699632, 2021.
[3] Mehul Rastogi, Sen Lu, Nafiul Islam, Abhronil Sengupta, “On the Self-Repair Role of Astrocytes in STDP Enabled Unsupervised SNNs”, Frontiers in Neuroscience, Vol. 14, No. 603796, 2021.
Ferroic Material Based Neural Networks
For the implementation of bio-plausible neural networks, the core computing kernel is a weighted synaptic summation of inputs followed by neural processing, together with key biological properties like neural spiking, synaptic plasticity and temporal dynamics. We have been exploring ferromagnetic and ferroelectric neuro-synaptic devices that emulate such characteristics through their intrinsic physics at low operating energies. Our research extends from fundamental experimental investigations to device proposals and circuit and system design, pursued in a cross-layer hardware-algorithm co-design spirit.
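The weighted-summation kernel above is what a crossbar array of such devices computes in a single analog read, with each synapse restricted to a limited number of conductance levels. The sketch below models that behaviorally; the 17-level default and the function names are illustrative assumptions of ours, not device specifications.

```python
def quantize(w, levels=17, w_max=1.0):
    """Snap an ideal weight in [-w_max, w_max] onto one of a limited
    number of device conductance levels (multi-level FeFET-like
    synapse). levels=17 is an odd count so that zero is representable."""
    step = 2 * w_max / (levels - 1)
    return round((w + w_max) / step) * step - w_max

def crossbar_mac(inputs, weights):
    """Weighted synaptic summation: the dot product a crossbar
    array evaluates in one analog read operation, here with each
    weight limited to the available conductance levels."""
    return sum(x * quantize(w) for x, w in zip(inputs, weights))
```

Behaviorally, the quantization error introduced by the limited conductance resolution is exactly the kind of device constraint that the cross-layer co-design mentioned above must absorb at the algorithm level.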
Relevant Publications:
[1] Arnob Saha, Bibhas Manna, Sen Lu, Zhouhang Jiang, Kai Ni, Abhronil Sengupta, “Device Feasibility Analysis of Multi-level FeFETs for Neuromorphic Computing”, IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2024.
[2] A. N. M. Nafiul Islam, Kai Ni, Abhronil Sengupta, “Cross-Layer Optimizations for Ferroelectric Neuromorphic Computing”, IEEE International Midwest Symposium on Circuits and Systems (MWSCAS), 2023 (Invited Special Session Paper).
[3] Kezhou Yang, Abhronil Sengupta, “Leveraging Voltage-Controlled Magnetic Anisotropy to Solve Sneak Path Issues in Crossbar Arrays”, IEEE Transactions on Electron Devices, Vol. 70, Iss. 4, pp. 2021-2027, 2023.
[4] Arnob Saha*, A. N. M. Nafiul Islam*, Zijian Zhao, Shan Deng, Kai Ni, Abhronil Sengupta, “Intrinsic synaptic plasticity of ferroelectric field effect transistors for online learning”, Applied Physics Letters, Vol. 119, Iss. 13, pp. 133701, 2021 (* denotes equal first author contribution, DOE EFRC 3DFeM Highlight).
[5] Wilson Yanez, Yongxi Ou, Run Xiao, Jahyun Koo, Jacob T. Held, Supriya Ghosh, Jeffrey Rable, Timothy Pillsbury, Enrique González Delgado, Kezhou Yang, Juan Chamorro, Alexander J. Grutter, Patrick Quarterman, Anthony Richardella, Abhronil Sengupta, Tyrel McQueen, Julie A. Borchers, K. Andre Mkhoyan, Binghai Yan, and Nitin Samarth, “Spin and charge interconversion in Dirac semimetal thin films”, Physical Review Applied, Vol. 16, Iss. 5, pp. 054031, 2021 (Editor’s Suggestion).
[6] Sonali Singh, Anup Sarma, Nicholas Jao, Ashutosh Pattnaik, Sen Lu, Kezhou Yang, Abhronil Sengupta, Vijaykrishnan Narayanan, Chita Das, “NEBULA: A Neuromorphic Spin-Based Ultra-Low Power Architecture for SNNs and ANNs”, IEEE/ACM International Symposium on Computer Architecture (ISCA), 2020.
Bayesian Machine Learning
Uncertainty plays a key role in real-time machine learning. While Bayesian deep learning has shown promise as a pathway to probabilistic machine learning, its algorithms have been developed largely without regard to the underlying hardware implementation. Bayesian techniques are more computationally expensive than their non-Bayesian counterparts, thereby limiting their training and deployment in resource-constrained environments like wearables and mobile edge devices. We are exploring neuromorphic hardware designs that leverage device-level non-idealities as a computing resource for probabilistic AI hardware.
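The computational cost referred to above comes from Monte-Carlo inference: each forward pass samples the synaptic weights from a learned distribution, and the spread across passes quantifies the prediction's uncertainty. The sketch below shows this for a single neuron with Gaussian weight posteriors; on probabilistic AI hardware, device noise would ideally supply these Gaussian samples natively. All names and values are illustrative.

```python
import random

def bayesian_neuron(x, w_mean, w_std, n_samples=200, seed=0):
    """Monte-Carlo inference for one Bayesian neuron: sample weights
    from Gaussian posteriors on every forward pass, then report the
    mean output and its standard deviation as an uncertainty estimate."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        w = [rng.gauss(m, s) for m, s in zip(w_mean, w_std)]   # sampled weights
        outputs.append(sum(xi * wi for xi, wi in zip(x, w)))   # forward pass
    mean = sum(outputs) / n_samples
    var = sum((o - mean) ** 2 for o in outputs) / n_samples
    return mean, var ** 0.5
```

The n_samples repeated passes are exactly where the cost multiplier over a point-estimate network arises, and why exploiting intrinsic device noise as the sampling source is attractive for edge deployment.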
Relevant Publications:
[1] Bibhas Manna, Arnob Saha, Zhouhang Jiang, Kai Ni, Abhronil Sengupta, “Variation-Resilient FeFET-Based In-Memory Computing Leveraging Probabilistic Deep Learning”, IEEE Transactions on Electron Devices, Vol. 71, Iss. 5, pp. 2963-2969, 2024.
[2] Kezhou Yang, Akul Malhotra, Sen Lu, Abhronil Sengupta, “All-Spin Bayesian Neural Networks”, IEEE Transactions on Electron Devices, Vol. 67, Iss. 3, pp. 1340-1347, 2020.
[3] Akul Malhotra, Sen Lu, Kezhou Yang, Abhronil Sengupta, “Exploiting Oxide Based Resistive RAM Variability for Bayesian Neural Network Hardware Design”, IEEE Transactions on Nanotechnology, Vol. 19, pp. 328-331, 2020.
Stochastic Computing
As device dimensions scale down, the bit resolution offered by emerging neuromorphic device technologies is also expected to become limited. Further, probabilistic fluctuations observed in the switching dynamics of nanomagnets and ferroelectrics will become increasingly prominent. To leverage this underlying device stochasticity, we recently proposed state-compressed stochastic spiking neural networks in which probabilistic binary state updates replace the multi-bit representation of synapses and neurons. Our proposal explores stochastic neural inference as well as unsupervised stochastic learning in synapses that can potentially be implemented using emerging devices.
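The core idea of a probabilistic binary state update can be sketched in a few lines: instead of adding a small analog increment to a multi-bit weight, a single-bit synapse is switched toward the target state with a probability proportional to the intended increment, so the update is correct in expectation. This mirrors the probabilistic switching of a nanomagnet or ferroelectric domain under a sub-critical programming pulse; the function below is a behavioral sketch of ours, not a device model.

```python
import random

def stochastic_update(w_binary, delta, rng):
    """Probabilistic binary state update: replace the analog
    increment `delta` on a multi-bit weight with a single-bit
    switch toward the target state, taken with probability
    proportional to |delta| (clipped to 1)."""
    p = min(abs(delta), 1.0)            # switching probability
    target = 1 if delta > 0 else 0      # direction of intended update
    if rng.random() < p:                # probabilistic switching event
        return target
    return w_binary
```

Averaged over many presentations, the binary synapse drifts in the same direction an analog weight would, which is what makes unsupervised stochastic learning with single-bit devices workable.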
Relevant Publications:
[1] A. N. M. Nafiul Islam, Kezhou Yang, Amit K. Shukla, Pravin Khanal, Bowei Zhou, Wei-Gang Wang, Abhronil Sengupta, “Hardware in Loop Learning with Spin Stochastic Neurons”, Advanced Intelligent Systems, Vol. 6, Iss. 7, pp. 2300805, 2024.
[2] A. N. M. Nafiul Islam, Arnob Saha, Zhouhang Jiang, Kai Ni, Abhronil Sengupta, “Hybrid Stochastic Synapses Enabled by Scaled Ferroelectric Field-effect Transistors”, Applied Physics Letters, Vol. 122, Iss. 12, pp. 123701, 2023.
[3] Kezhou Yang, Dhuruva Priyan G M, Abhronil Sengupta, “Leveraging Probabilistic Switching in Superparamagnets for Temporal Information Encoding in Neuromorphic Systems”, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, Vol. 42, Iss. 10, pp. 3464-3468, 2023.
[4] Sunbin Deng, Tae Joon Park, Haoming Yu, Arnob Saha, A. N. M. Nafiul Islam, Qi Wang, Abhronil Sengupta, Shriram Ramanathan, “Hydrogenated VO2 Bits for Probabilistic Computing”, IEEE Electron Device Letters, Vol. 44, Iss. 10, pp. 1776-1779, 2023.
[5] Haoming Yu, A. N. M. Nafiul Islam, Sandip Mondal, Abhronil Sengupta, Shriram Ramanathan, “Switching Dynamics in Vanadium Dioxide-Based Stochastic Thermal Neurons”, IEEE Transactions on Electron Devices, Vol. 69, Iss. 6, pp. 3135-3141, 2022.
[6] Kezhou Yang, Abhronil Sengupta, “Stochastic magnetoelectric neuron for temporal information encoding”, Applied Physics Letters, Vol. 116, Iss. 4, pp. 043701, 2020.
[7] Amogh Agrawal, Indranil Chakraborty, Deboleena Roy, Utkarsh Saxena, Saima Sharmin, Yong Shim, Gopalakrishnan Srinivasan, Chamika Liyanagedera, Abhronil Sengupta, Kaushik Roy, “Revisiting Stochastic Computing in the Era of Nano-scale Non-volatile Technologies”, IEEE Transactions on Very Large Scale Integration Systems, Vol. 28, Iss. 12, pp. 2481-2494, 2020 (Invited Keynote Paper).
Application Drivers
While SNN algorithms have shown initial promise on various standard machine learning tasks ranging from spatial navigation to object detection, the true potential of SNNs will be realized in application domains that inherently involve sparse, asynchronous, spatio-temporal data patterns. Motivated by this observation, we are exploring application drivers that are a natural fit for SNN algorithms, thereby leading to drastic reductions in computational effort and power consumption. Recent areas of exploration include event-driven sensors, power-system disturbance classification in the smart grid and cybersecurity, among others.
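The source of the computational savings can be made concrete with a small sketch: for the sparse event streams produced by, e.g., event-based vision sensors, an event-driven system performs one operation per event rather than one per input channel per time step. The data format and operation count below are illustrative assumptions of ours.

```python
def event_driven_response(events, weights, n_channels, n_steps):
    """Event-driven accumulation: `events` is a sparse list of
    (time, channel) spike tuples. Compute one multiply-accumulate
    per event, and compare against the operation count a dense
    frame-based pipeline would incur over the same window."""
    response = 0.0
    ops = 0
    for _, channel in events:
        response += weights[channel]    # one MAC per event
        ops += 1
    dense_ops = n_channels * n_steps    # dense baseline: every channel, every step
    return response, ops, dense_ops
```

For a stream with only a handful of events across many channels and time steps, the ratio dense_ops / ops is the (idealized) compute reduction that motivates pairing SNNs with these application domains.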
Relevant Publications:
[1] Malyaban Bal, George M. Nishibuchi, Suhas Chelian, Srini Vasan, Abhronil Sengupta, “Bio-plausible Hierarchical Semi-Supervised Learning for Intrusion Detection”, ACM International Conference on Neuromorphic Systems (ICONS), 2023.
[2] Wyler Zahm, Tyler Stern, Malyaban Bal, Abhronil Sengupta, Aswin Jose, Suhas Chelian, Srini Vasan, “Cyber-Neuro RT: Real-time Neuromorphic Cybersecurity”, Procedia Computer Science, Vol. 213, pp. 536-545, 2022.
[3] Sonali Singh, Anup Sarma, Sen Lu, Abhronil Sengupta, Vijaykrishnan Narayanan, Chita Das, “Gesture-SNN: Co-optimizing accuracy, latency and energy of SNNs for neuromorphic vision sensors”, ACM/IEEE International Symposium on Low Power Electronics and Design (ISLPED), 2021.
[4] Kaveri Mahapatra, Sen Lu, Abhronil Sengupta, Nilanjan Ray Chaudhuri, “Power System Disturbance Classification with Online Event-Driven Neuromorphic Computing”, IEEE Transactions on Smart Grid, Vol. 12, Iss. 3, pp. 2343 – 2354, 2021.