Published January 19, 2026 | Version v2
Working paper | Open Access

Hybrid Spiking Language Model: Combining Spike Counts and Membrane Potentials for Energy-Efficient and Noise-Robust Character Prediction

Description

I propose a novel character-level language model based on Spiking Neural Networks (SNNs) that combines spike counts and membrane potentials for output prediction. Unlike conventional SNN approaches, which use only spike counts, this hybrid method also leverages the analog information carried by membrane potentials.
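As a rough illustration of the hybrid readout described above, here is a minimal sketch using a leaky integrate-and-fire (LIF) layer. All dimensions, weights, and the mixing coefficient `alpha` are hypothetical placeholders, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes for illustration only (not the paper's actual architecture)
n_hidden, n_chars, T = 64, 27, 20  # hidden neurons, vocabulary size, time steps

# Separate readout weights for the two information channels
W_spike = rng.normal(0, 0.1, (n_chars, n_hidden))
W_mem = rng.normal(0, 0.1, (n_chars, n_hidden))

def lif_layer(inputs, tau=0.9, v_th=1.0):
    """Run a leaky integrate-and-fire layer for T steps.

    Returns per-neuron spike counts (discrete) and the final
    membrane potentials (analog)."""
    v = np.zeros(n_hidden)
    counts = np.zeros(n_hidden)
    for t in range(T):
        v = tau * v + inputs[t]      # leaky integration of input current
        fired = v >= v_th            # threshold crossing -> spike
        counts += fired
        v = np.where(fired, 0.0, v)  # reset neurons that fired
    return counts, v

# Drive the layer with random input currents for the sketch
inputs = rng.normal(0, 0.3, (T, n_hidden))
counts, v_final = lif_layer(inputs)

# Hybrid readout: combine spike counts with residual membrane potential
alpha = 0.5  # mixing weight (hypothetical; not specified in this abstract)
logits = W_spike @ counts + alpha * (W_mem @ v_final)
probs = np.exp(logits - logits.max())
probs /= probs.sum()
next_char_idx = probs.argmax()  # predicted next-character index
```

A pure spike-count model would drop the `W_mem @ v_final` term; the hybrid readout keeps the sub-threshold analog state that spike counts discard.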

Key Findings (v2):
- SNN achieves the best perplexity (PPL = 9.90) vs. DNN (11.28) and LSTM (15.67)
- 14.7× more energy-efficient through sparse computation (only 7.6% of neurons fire)
- 39.7% quality improvement from the hybrid (spike + membrane) approach
- Extreme compressibility: the model still works after 80% neuron pruning and 4-bit quantization
- 8× memory compression with minimal quality loss
- Noise robust: no degradation at 30% input noise
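The 8× memory-compression figure is consistent with storing float32 weights in 4 bits. The paper's exact quantization scheme is not specified in this abstract, so the sketch below assumes the simplest variant, uniform symmetric per-tensor quantization:

```python
import numpy as np

def quantize_4bit(w):
    """Uniform symmetric 4-bit quantization (sketch).

    Maps float weights to signed integers in [-8, 7] plus one
    float scale per tensor: 32 bits -> 4 bits, i.e. 8x compression."""
    scale = np.abs(w).max() / 7.0  # use the symmetric range ±7
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the 4-bit codes."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
w = rng.normal(0, 0.1, (64, 64)).astype(np.float32)

q, s = quantize_4bit(w)
w_hat = dequantize(q, s)
max_err = np.abs(w - w_hat).max()  # bounded by half a quantization step
```

Each weight's reconstruction error is at most half a quantization step (`s / 2`), which is the "minimal quality loss" regime the findings describe.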

Source code: https://github.com/hafufu-stack/snn-language-model

Files (198.5 kB)

paper_snn_lm_v2.pdf (198.5 kB)
md5:7bc2c93b40402a64d4c550f16e566098
