Text Classification
Soft-Label Integration for Robust Toxicity Classification
·2918 words·14 mins
AI Generated
Natural Language Processing
Text Classification
🏢 Northwestern University
Boosting toxicity classification robustness, this paper introduces a novel bi-level optimization framework that integrates crowdsourced soft labels with GroupDRO to enhance resistance to out-of-distribution data.
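The two ingredients this summary names can be pictured in a few lines. Below is a minimal, illustrative sketch (assuming a PyTorch classifier, binary toxic/non-toxic labels, and precomputed group IDs) of soft-label cross-entropy over annotator distributions combined with a GroupDRO-style worst-group reweighting; it is not the paper's actual bi-level formulation, and all names are placeholders.

```python
# Illustrative sketch only: soft-label cross-entropy combined with a
# GroupDRO-style worst-group reweighting. Not the paper's bi-level objective.
import torch
import torch.nn.functional as F

def soft_label_loss(logits, soft_targets):
    """Per-example cross-entropy against crowdsourced label distributions
    (e.g. the fraction of annotators who marked the text toxic)."""
    log_probs = F.log_softmax(logits, dim=-1)
    return -(soft_targets * log_probs).sum(dim=-1)

class GroupDROWeights:
    """Exponential-moving weights over groups; the worst group is upweighted."""
    def __init__(self, n_groups, step_size=0.01):
        self.weights = torch.ones(n_groups) / n_groups
        self.step_size = step_size

    def loss(self, per_example_loss, group_ids):
        group_losses = []
        for g in range(len(self.weights)):
            mask = group_ids == g
            group_losses.append(per_example_loss[mask].mean() if mask.any()
                                else per_example_loss.new_zeros(()))
        group_losses = torch.stack(group_losses)
        # Multiplicative-weights update toward the currently worst groups.
        self.weights = self.weights * torch.exp(self.step_size * group_losses.detach())
        self.weights = self.weights / self.weights.sum()
        return (self.weights * group_losses).sum()

# Typical use with any classifier producing (batch, 2) logits:
#   per_ex = soft_label_loss(model(batch["input_ids"]), batch["soft_labels"])
#   loss = dro.loss(per_ex, batch["group_ids"]); loss.backward()
```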
Navigating Extremes: Dynamic Sparsity in Large Output Spaces
·2090 words·10 mins
Natural Language Processing
Text Classification
🏢 Department of Computer Science, Aalto University
SPARTEX achieves memory-efficient extreme multi-label classification by integrating dynamic sparse training with an auxiliary loss function, enabling end-to-end training with millions of labels on commodity hardware.
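As a rough illustration of what dynamic sparse training over a huge output layer can look like, the sketch below keeps a SET-style binary mask on the label matrix, periodically pruning low-magnitude weights and regrowing random ones, and adds an auxiliary loss over coarse label clusters. The layer names, density, and meta-label scheme are assumptions for illustration, not SPARTEX itself.

```python
# SET-style dynamic-sparsity sketch for a very large label layer, plus an
# auxiliary loss over coarse label clusters. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseLabelLayer(nn.Module):
    def __init__(self, hidden_dim, n_labels, density=0.05):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(n_labels, hidden_dim) * 0.01)
        self.register_buffer("mask", (torch.rand(n_labels, hidden_dim) < density).float())

    def forward(self, h):
        # Only the masked (active) weights contribute to the label scores.
        return F.linear(h, self.weight * self.mask)

    @torch.no_grad()
    def prune_and_regrow(self, frac=0.3):
        # Drop the lowest-magnitude active weights, regrow the same number at random.
        w = (self.weight * self.mask).abs().flatten()
        active = self.mask.flatten().nonzero().squeeze(1)
        n_drop = int(frac * active.numel())
        drop = active[w[active].argsort()[:n_drop]]
        self.mask.view(-1)[drop] = 0.0
        inactive = (self.mask.view(-1) == 0).nonzero().squeeze(1)
        grow = inactive[torch.randperm(inactive.numel())[:n_drop]]
        self.mask.view(-1)[grow] = 1.0
        self.weight.view(-1)[grow] = 0.0

def total_loss(label_logits, meta_logits, labels, meta_labels, aux_weight=0.1):
    # Main multi-label BCE plus an auxiliary loss over coarse label clusters,
    # which gives the sparse layer a denser training signal.
    main = F.binary_cross_entropy_with_logits(label_logits, labels)
    aux = F.binary_cross_entropy_with_logits(meta_logits, meta_labels)
    return main + aux_weight * aux
```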
Is the MMI Criterion Necessary for Interpretability? Degenerating Non-causal Features to Plain Noise for Self-Rationalization
·1904 words·9 mins
Natural Language Processing
Text Classification
🏢 Huazhong University of Science and Technology
A new criterion maximizes the remaining discrepancy after rationale removal, treating non-causal features as plain noise and improving rationale extraction.
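The contrast with the classic MMI criterion can be sketched as two loss terms: one that asks the selected rationale to predict the label, and one that pushes the prediction from the unselected remainder toward an uninformative, noise-like distribution. The code below is a loose illustration of that idea (the predictor, masking scheme, and KL-to-uniform term are assumptions), not the paper's exact criterion.

```python
# Loose sketch: an MMI-style term on the selected rationale, plus a term that
# degenerates the unselected remainder toward a noise-like (uniform) prediction.
import torch
import torch.nn.functional as F

def rationale_losses(predictor, emb, rationale_mask, labels, n_classes):
    """emb: (B, T, D) token embeddings; rationale_mask: (B, T) values in [0, 1]."""
    # MMI-style term: the rationale alone should predict the label.
    logits_r = predictor(emb * rationale_mask.unsqueeze(-1))
    mmi_loss = F.cross_entropy(logits_r, labels)

    # Degeneration term: the leftover (non-causal) tokens should look like
    # plain noise to the predictor, i.e. yield a near-uniform prediction.
    logits_c = predictor(emb * (1.0 - rationale_mask).unsqueeze(-1))
    uniform = torch.full_like(logits_c, 1.0 / n_classes)
    noise_loss = F.kl_div(F.log_softmax(logits_c, dim=-1), uniform,
                          reduction="batchmean")

    return mmi_loss, noise_loss  # typically combined as mmi + lambda * noise
```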
Continual Learning with Global Alignment
·1784 words·9 mins
Natural Language Processing
Text Classification
🏢 Stony Brook University
Researchers developed a novel continual learning method that achieves state-of-the-art performance by aligning data representations across tasks using pre-trained tokens, eliminating the need for experience replay.
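One way to picture representation alignment without a replay buffer is to keep a frozen pre-trained encoder as a shared anchor space and penalize drift away from it while training on each new task. The sketch below assumes BERT-style encoders from Hugging Face `transformers` and a cosine alignment term; the model names and loss form are illustrative, not the paper's method.

```python
# Frozen-anchor alignment sketch: tie the continually trained encoder's
# sentence representations back to a fixed pre-trained encoder.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

anchor = AutoModel.from_pretrained("bert-base-uncased").eval()   # frozen anchor space
learner = AutoModel.from_pretrained("bert-base-uncased")          # trained across tasks
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def alignment_loss(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        anchor_repr = anchor(**batch).last_hidden_state.mean(dim=1)
    learner_repr = learner(**batch).last_hidden_state.mean(dim=1)
    # Pull the learner's representations toward the fixed anchor space.
    return 1.0 - F.cosine_similarity(learner_repr, anchor_repr, dim=-1).mean()

# Per task t: total = task_classification_loss + alpha * alignment_loss(batch_texts)
```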
Concentrate Attention: Towards Domain-Generalizable Prompt Optimization for Language Models
·3084 words·15 mins
Natural Language Processing
Text Classification
🏢 Xi'an Jiaotong University
Boost language model performance across domains with ‘Concentration’: a new prompt optimization objective that prioritizes stable, deep-layer attention.
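A plausible way to operationalize such a "Concentration" score is to measure how much attention the deeper layers direct from the final position onto the prompt tokens, and how stable that mass stays across those layers, then rank candidate prompts by it. The statistic below and its use in prompt search are assumptions for illustration, not the paper's definition.

```python
# Assumed "concentration"-style score for ranking candidate prompts by
# deep-layer attention mass on the prompt and its stability across layers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2", output_attentions=True).eval()

@torch.no_grad()
def concentration_score(prompt, query, deep_layers=4):
    ids = tokenizer(prompt + query, return_tensors="pt")
    n_prompt = len(tokenizer(prompt)["input_ids"])
    attentions = model(**ids).attentions            # tuple of (1, heads, T, T)
    # Attention mass flowing from the last position onto the prompt tokens,
    # taken from the deepest layers only.
    per_layer = torch.stack([a[0, :, -1, :n_prompt].sum(-1).mean()
                             for a in attentions[-deep_layers:]])
    strength = per_layer.mean()                      # how concentrated on the prompt
    stability = per_layer.std()                      # fluctuation across deep layers
    return (strength - stability).item()

# Candidate prompts can then be ranked by this score, preferring prompts that
# draw strong, stable deep-layer attention.
```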