Introduces LoZA (LongCat ZigZag Attention), a sparse attention scheme for efficient long-context scaling in LongCat models.

Paper

arXiv: 2512.23966

attention · efficiency · scaling

Related