arXiv:2601.16622

E2Former-V2: On-the-Fly Equivariant Attention with Linear Activation Memory

Published on Jan 23
Abstract

E2Former-V2 scales equivariant graph neural networks through sparse tensor operations and hardware-aware attention mechanisms, achieving significant performance improvements on 3D atomic system modeling.

AI-generated summary

Equivariant Graph Neural Networks (EGNNs) have become a widely used approach for modeling 3D atomistic systems. However, mainstream architectures face critical scalability bottlenecks due to the explicit construction of geometric features or dense tensor products on every edge. To overcome this, we introduce E2Former-V2, a scalable architecture that integrates algebraic sparsity with hardware-aware execution. We first propose Equivariant Axis-Aligned Sparsification (EAAS). EAAS builds on Wigner-6j convolution by exploiting an SO(3) → SO(2) change of basis to transform computationally expensive dense tensor contractions into efficient, sparse parity re-indexing operations. Building on this representation, we introduce On-the-Fly Equivariant Attention, a fully node-centric mechanism implemented via a custom fused Triton kernel. By eliminating materialized edge tensors and maximizing SRAM utilization, our kernel achieves a 20× improvement in TFLOPS compared to standard implementations. Extensive experiments on the SPICE and OMol25 datasets demonstrate that E2Former-V2 maintains comparable predictive performance while notably accelerating inference. This work demonstrates that large equivariant transformers can be trained efficiently using widely accessible GPU platforms. The code is available at https://github.com/IQuestLab/UBio-MolFM/tree/e2formerv2.
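The SO(3) → SO(2) change of basis the summary refers to can be illustrated with a minimal NumPy sketch: rotating each edge's features into a frame where the edge direction lies on the z-axis, after which an SO(2)-equivariant mixing acts only on small, sparse blocks (here shown for an l=1 feature, i.e. a plain 3-vector). This is an illustrative toy under stated assumptions, not the paper's implementation; the rotation helper and the mixing parameters below are hypothetical.

```python
import numpy as np

def rotation_to_z(d):
    """Rotation matrix mapping unit vector d onto the z-axis (Rodrigues formula)."""
    d = d / np.linalg.norm(d)
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(d, z)                 # rotation axis (unnormalized)
    c = float(d @ z)                   # cosine of the rotation angle
    if np.isclose(c, -1.0):            # antiparallel edge: rotate pi about x
        return np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0.0, -v[2],  v[1]],
                   [v[2],  0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

# A toy l=1 equivariant feature on an edge with unit direction d.
d = np.array([1.0, 2.0, 2.0]) / 3.0
feat = np.array([0.3, -0.7, 1.1])

R = rotation_to_z(d)
aligned = R @ feat                     # edge-aligned frame: (m=+1, m=-1, m=0)

# In the aligned frame an SO(2)-equivariant map is block-sparse: a 2x2
# rotation-plus-scale on the (x, y) pair and an independent scale on z,
# instead of a dense 3x3 contraction per edge.
theta, scale0 = 0.5, 2.0               # hypothetical learned parameters
cth, sth = np.cos(theta), np.sin(theta)
mixed = np.array([cth * aligned[0] - sth * aligned[1],
                  sth * aligned[0] + cth * aligned[1],
                  scale0 * aligned[2]])

out = R.T @ mixed                      # rotate back to the global frame
```

Because the sparse mixing commutes with rotations about the aligned z-axis, the overall map stays SO(3)-equivariant while avoiding the dense per-edge tensor product; the paper's EAAS applies this idea across higher-degree irreps with Wigner-6j sparsity.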

