QKFormer: Hierarchical Spiking Transformer using Q-K Attention
Image Classification
🏢 Pengcheng Laboratory
QKFormer: A hierarchical spiking transformer that reaches 85.65% top-1 accuracy on ImageNet-1K with a linear-complexity, energy-efficient Q-K attention mechanism.
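The linear complexity comes from Q-K attention replacing the quadratic query-key score matrix with a binary token mask derived from Q alone, which is then applied to K. Below is a minimal sketch of the token-attention variant, assuming binary spike inputs and a Heaviside step as a stand-in for the spiking neuron; the class and function names are illustrative, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class QKTokenAttention(nn.Module):
    """Sketch of Q-K token attention (illustrative, not the paper's code).

    Instead of the O(N^2) Q @ K^T score matrix, a binary per-token mask
    is formed from Q alone and applied to K, giving O(N * d) complexity.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.k_proj = nn.Linear(dim, dim, bias=False)

    @staticmethod
    def spike(x: torch.Tensor) -> torch.Tensor:
        # Stand-in for a spiking neuron: Heaviside step to {0, 1}.
        # A real SNN would train this with a surrogate gradient.
        return (x > 0).float()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch B, tokens N, channels d), assumed binary spikes
        q = self.spike(self.q_proj(x))  # (B, N, d)
        k = self.spike(self.k_proj(x))  # (B, N, d)
        # Token mask: sum Q over channels, then spike -> (B, N, 1)
        mask = self.spike(q.sum(dim=-1, keepdim=True))
        # Gate the tokens of K with the binary mask: O(N * d), no N x N matrix
        return mask * k


if __name__ == "__main__":
    x = (torch.rand(2, 16, 32) > 0.5).float()  # toy binary spike input
    out = QKTokenAttention(32)(x)
    print(out.shape)  # torch.Size([2, 16, 32])
```

Because the mask and K are both binary, the gating reduces to cheap element-wise AND-like operations, which is where the energy savings over standard floating-point attention come from.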