Jun. 28, 2025: Our work has been accepted to ICCV 2025 🎉.
SpiLiFormer (Spiking Transformer with Lateral Inhibition) is a novel brain-inspired spiking transformer architecture designed to enhance the performance and robustness of spiking neural networks (SNNs).
Inspired by the lateral inhibition mechanism in the human visual system—which helps the brain focus on salient regions by suppressing responses from neighboring neurons—SpiLiFormer introduces two new attention modules:
- FF-LiDiff Attention (Feedforward-pathway Lateral Differential Inhibition): Inspired by short-range inhibition in the retina, this module reduces distraction in the shallow stages of the network by differentially inhibiting attention responses.
- FB-LiDiff Attention (Feedback-pathway Lateral Differential Inhibition): Inspired by long-range cortical inhibition, this module incorporates feedback signals to refine attention allocation in the deeper stages of the network.
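To give a flavor of how lateral differential inhibition can reshape an attention map, here is a minimal, self-contained NumPy sketch. It is an illustrative toy, not the paper's actual formulation: the function name `laterally_inhibited_attention`, the neighbor-averaging scheme, and the `inhibition_strength` parameter are all assumptions made for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def laterally_inhibited_attention(q, k, v, inhibition_strength=0.5):
    """Toy attention with a lateral-inhibition-style term: each attention
    score is suppressed by the mean response of its neighboring key
    positions. Illustrative sketch only, not SpiLiFormer's exact module."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # (T, T) scaled dot-product scores
    attn = softmax(scores, axis=-1)
    # Neighbor average along the key axis (simple 1-D smoothing). Subtracting
    # it sharpens salient peaks and suppresses flat, distracting regions,
    # mimicking suppression of responses from neighboring neurons.
    neighbor = (np.roll(attn, 1, axis=-1) + np.roll(attn, -1, axis=-1)) / 2
    inhibited = np.maximum(attn - inhibition_strength * neighbor, 0.0)
    # Renormalize so each row remains a valid attention distribution.
    inhibited /= inhibited.sum(axis=-1, keepdims=True) + 1e-12
    return inhibited @ v

rng = np.random.default_rng(0)
T, d = 4, 8
q, k, v = rng.standard_normal((3, T, d))
out = laterally_inhibited_attention(q, k, v)
print(out.shape)  # (4, 8)
```

In the actual architecture, the inhibition acts differentially across the feedforward (shallow) and feedback (deep) pathways; the sketch above only shows the core idea of subtracting a neighborhood response before renormalizing.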
- Release the checkpoints of our models
- Release the code of our work
If you use the code or data in this repo, or find our work helpful, please consider citing:
@misc{zheng2025spiliformerenhancingspikingtransformers,
      title={SpiLiFormer: Enhancing Spiking Transformers with Lateral Inhibition},
      author={Zeqi Zheng and Yanchen Huang and Yingchao Yu and Zizheng Zhu and Junfeng Tang and Zhaofei Yu and Yaochu Jin},
      year={2025},
      eprint={2503.15986},
      archivePrefix={arXiv},
      primaryClass={cs.NE},
      url={https://arxiv.org/abs/2503.15986},
}