🏢 Department of Computer Science Johns Hopkins University
FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion
2650 words · 13 mins
Multimodal Learning
Multimodal Understanding
FuseMoE is a mixture-of-experts transformer that flexibly fuses diverse, potentially incomplete combinations of input modalities, achieving superior predictive performance via a novel Laplace gating function.
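The core idea behind a Laplace gating function is to route inputs by distance rather than by dot product: experts whose learnable anchor vectors lie closer to an input receive exponentially more weight. The sketch below is a minimal PyTorch illustration of that idea, not the authors' implementation; the `LaplaceGate` and `LaplaceMoE` names, the per-expert anchor parameterization, and all dimensions are assumptions for demonstration.

```python
import torch
import torch.nn as nn

class LaplaceGate(nn.Module):
    """Routes inputs to experts with a Laplace (negative-distance) gate
    instead of the usual dot-product softmax gate (illustrative sketch)."""
    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        # One learnable "anchor" vector per expert (an assumption, not
        # necessarily the paper's parameterization).
        self.anchors = nn.Parameter(torch.randn(num_experts, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, dim); distances to anchors: (batch, num_experts).
        dists = torch.cdist(x, self.anchors)
        # Laplace-style weights: exp(-d_i) / sum_j exp(-d_j), so closer
        # anchors receive exponentially more routing weight.
        return torch.softmax(-dists, dim=-1)

class LaplaceMoE(nn.Module):
    """Tiny dense mixture-of-experts layer built on the gate above."""
    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        self.gate = LaplaceGate(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.gate(x)                                    # (batch, E)
        outs = torch.stack([e(x) for e in self.experts], dim=1)   # (batch, E, dim)
        return (weights.unsqueeze(-1) * outs).sum(dim=1)          # (batch, dim)

if __name__ == "__main__":
    layer = LaplaceMoE(dim=32, num_experts=4)
    fused = layer(torch.randn(8, 32))
    print(fused.shape)  # torch.Size([8, 32])
```

In FuseMoE the gate operates over expert networks assigned to fused multimodal representations; this sketch uses a dense (all-experts) combination for brevity, whereas practical MoE layers typically keep only the top-k gate weights per input.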