arxiv:2503.03379

Prosperity: Accelerating Spiking Neural Networks via Product Sparsity

Published on Apr 2, 2025
Abstract

Product Sparsity enhances SNN efficiency by leveraging combinatorial similarities in matrix operations to reduce redundant computations, achieving significant performance and energy improvements over traditional bit sparsity methods.

AI-generated summary

Spiking Neural Networks (SNNs) are highly efficient due to their spike-based activation, which inherently produces bit-sparse computation patterns. Existing hardware implementations of SNNs leverage this sparsity pattern to avoid wasteful zero-value computations, yet this approach fails to fully capitalize on the potential efficiency of SNNs. This study introduces a novel sparsity paradigm called Product Sparsity, which leverages combinatorial similarities within matrix multiplication operations to reuse inner product results and reduce redundant computations. Product Sparsity significantly enhances sparsity in SNNs without compromising the original computation results compared to traditional bit sparsity methods. For instance, in the SpikeBERT SNN model, Product Sparsity achieves a density of only 1.23% and reduces computation by 11×, compared to bit sparsity, which has a density of 13.19%. To efficiently implement Product Sparsity, we propose Prosperity, an architecture that addresses the challenges of identifying and eliminating redundant computations in real time. Compared to the prior SNN accelerator PTB and the A100 GPU, Prosperity achieves average speedups of 7.4× and 1.8×, respectively, along with energy efficiency improvements of 8.0× and 193×, respectively. The code for Prosperity is available at https://github.com/dubcyfor3/Prosperity.
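The reuse idea in the abstract can be illustrated in software: because spike activations are binary, each row of the spike matrix is just a set of active indices, and a row whose active set contains another row's can start from that row's already-computed partial sum. The sketch below is a minimal, hypothetical illustration of this product-sparsity principle (function name and the subset-search heuristic are my own; the actual Prosperity architecture detects reuse opportunities in hardware, in real time).

```python
def spmm_product_sparsity(S, W):
    """Multiply a binary spike matrix S (M x K, lists of 0/1) by a dense
    weight matrix W (K x N), reusing partial result vectors between rows
    whose active-index sets overlap. Returns (results, saved), where
    `saved` counts the weight-row additions avoided via reuse."""
    N = len(W[0])
    results = []
    seen = []   # (active_set, row_index) pairs already computed
    saved = 0
    for i, row in enumerate(S):
        active = frozenset(k for k, s in enumerate(row) if s)
        # Pick the largest previously computed active set that is a
        # subset of this row's active set (greedy stand-in for the
        # real-time matching the accelerator performs).
        best_idx, best_set = None, frozenset()
        for prev_set, prev_idx in seen:
            if prev_set <= active and len(prev_set) > len(best_set):
                best_idx, best_set = prev_idx, prev_set
        if best_idx is not None:
            acc = list(results[best_idx])  # reuse the partial sum
            saved += len(best_set)
        else:
            acc = [0.0] * N
        for k in active - best_set:        # add only the remainder
            acc = [a + w for a, w in zip(acc, W[k])]
        results.append(acc)
        seen.append((active, i))
    return results, saved
```

For example, if row 1's active indices are a superset of row 0's, row 1's output is computed as row 0's result plus the weight rows for the extra indices only; identical rows cost no new additions at all.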

