
Transfer Learning

Expanding Sparse Tuning for Low Memory Usage
·2517 words·12 mins
Computer Vision Transfer Learning 🏢 Tsinghua University
SNELL: Sparse tuning with kerNELized LoRA achieves state-of-the-art parameter-efficient fine-tuning performance with drastically reduced memory usage.
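As a rough illustration of the general idea named in the TL;DR (a frozen layer plus trainable low-rank factors whose merged update is sparsified), here is a minimal PyTorch sketch. The `SparseLoRALinear` class, its `keep_ratio` magnitude masking, and all hyperparameters are hypothetical; SNELL's kernelization and its memory-saving machinery are not reproduced here.

```python
import torch
import torch.nn as nn

class SparseLoRALinear(nn.Module):
    """Hypothetical sketch: frozen pre-trained linear layer plus trainable
    low-rank (LoRA-style) factors whose merged weight delta is masked so that
    only the largest-magnitude entries survive. Illustrative only; this is
    not SNELL's kernelized formulation."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0,
                 keep_ratio: float = 0.1):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)                     # pre-trained weights stay frozen
        out_f, in_f = base.weight.shape
        self.A = nn.Parameter(torch.randn(rank, in_f) * 0.01)  # trainable factor
        self.B = nn.Parameter(torch.zeros(out_f, rank))         # trainable factor
        self.scale = alpha / rank
        self.keep_ratio = keep_ratio

    def sparse_delta(self) -> torch.Tensor:
        d = (self.B @ self.A) * self.scale              # dense low-rank update
        k = max(1, int(self.keep_ratio * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        return d * (d.abs() >= threshold)               # keep only top-k magnitudes

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + x @ self.sparse_delta().T

layer = SparseLoRALinear(nn.Linear(768, 768))
out = layer(torch.randn(4, 768))                        # (4, 768)
```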
Enhancing Domain Adaptation through Prompt Gradient Alignment
·2283 words·11 mins
Machine Learning Transfer Learning 🏢 New York University
Prompt Gradient Alignment (PGA) enhances unsupervised domain adaptation by aligning per-objective gradients in a multi-objective optimization framework, achieving state-of-the-art results.
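The mechanism named in the TL;DR, aligning per-objective gradients, can be sketched with a hedged PyTorch example: compute the gradients of a source-domain and a target-domain loss with respect to shared prompt parameters, and reward their cosine similarity. The function `gradient_alignment_step`, the weight `lam`, and the toy losses are assumptions for illustration, not PGA's exact formulation.

```python
import torch
import torch.nn.functional as F

def gradient_alignment_step(params, loss_src, loss_tgt, optimizer, lam=0.1):
    """Hypothetical sketch: encourage the gradients of two objectives (w.r.t.
    shared prompt parameters) to point in similar directions by rewarding
    their cosine similarity. Not PGA's exact objective."""
    g_src = torch.autograd.grad(loss_src, params, create_graph=True)
    g_tgt = torch.autograd.grad(loss_tgt, params, create_graph=True)
    flat_src = torch.cat([g.reshape(-1) for g in g_src])
    flat_tgt = torch.cat([g.reshape(-1) for g in g_tgt])
    cos = F.cosine_similarity(flat_src, flat_tgt, dim=0)
    total = loss_src + loss_tgt - lam * cos             # reward aligned gradients
    optimizer.zero_grad()
    total.backward()
    optimizer.step()
    return total.item(), cos.item()

# toy usage with a single trainable "prompt" tensor
prompt = torch.nn.Parameter(torch.randn(4, 16))
opt = torch.optim.SGD([prompt], lr=1e-2)
x_src, x_tgt = torch.randn(8, 16), torch.randn(8, 16)
loss_src = ((x_src @ prompt.T) ** 2).mean()             # stand-in source objective
loss_tgt = ((x_tgt @ prompt.T) ** 2).mean()             # stand-in target objective
gradient_alignment_step([prompt], loss_src, loss_tgt, opt)
```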
Efficient Discrepancy Testing for Learning with Distribution Shift
·1471 words·7 mins
Machine Learning Transfer Learning 🏢 University of Texas at Austin
Provably efficient algorithms for learning with distribution shift are introduced, generalizing and improving prior work by achieving near-optimal error rates and offering universal learners for large…
Disentangling and mitigating the impact of task similarity for continual learning
·2158 words·11 mins
Machine Learning Transfer Learning 🏢 Washington University in St Louis
This study reveals that high input similarity paired with low output similarity is detrimental to continual learning, whereas the opposite scenario is relatively benign, offering insights into mitigat…
Diffusion Tuning: Transferring Diffusion Models via Chain of Forgetting
·2524 words·12 mins
Machine Learning Transfer Learning 🏢 Tsinghua University
Diff-Tuning: a simple yet effective approach transfers pre-trained diffusion models to various downstream tasks by leveraging the ‘chain of forgetting’ phenomenon, improving transferability and conver…
Deep Graph Mating
·1581 words·8 mins
Machine Learning Transfer Learning 🏢 University of Sydney
Deep Graph Mating (GRAMA) enables training-free knowledge transfer in GNNs, achieving results comparable to pre-trained models without retraining or labeled data.
Boosting Transferability and Discriminability for Time Series Domain Adaptation
·3675 words·18 mins
AI Generated Machine Learning Transfer Learning 🏢 School of Computer Science and Technology, Harbin Institute of Technology (Shenzhen)
ACON (Adversarial CO-learning Networks) enhances time series domain adaptation by combining temporal and frequency features: frequency features boost within-domain discriminability, while temp…
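A minimal sketch of the two-view idea (a temporal branch plus a frequency branch over the FFT magnitude spectrum), assuming a simple fully connected encoder per view; `TwoViewEncoder` and its fusion by concatenation are hypothetical and omit ACON's co-learning and adversarial components.

```python
import torch
import torch.nn as nn

class TwoViewEncoder(nn.Module):
    """Hypothetical sketch: encode a time series with two branches, one over
    the raw temporal signal and one over its FFT magnitude spectrum, then fuse
    the two feature vectors. Illustrative only; this is not ACON's co-learning
    networks or adversarial objectives."""

    def __init__(self, length: int, channels: int, dim: int = 64):
        super().__init__()
        self.temporal = nn.Sequential(
            nn.Flatten(), nn.Linear(channels * length, dim), nn.ReLU())
        self.frequency = nn.Sequential(
            nn.Flatten(), nn.Linear(channels * (length // 2 + 1), dim), nn.ReLU())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        spec = torch.fft.rfft(x, dim=-1).abs()           # frequency-domain view
        return torch.cat([self.temporal(x), self.frequency(spec)], dim=-1)

enc = TwoViewEncoder(length=128, channels=3)
z = enc(torch.randn(32, 3, 128))                         # (32, 128) fused features
```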
Beyond Efficiency: Molecular Data Pruning for Enhanced Generalization
·2607 words·13 mins
AI Generated Machine Learning Transfer Learning 🏢 Chinese Academy of Sciences
MolPeg, a novel molecular data pruning framework, enhances model generalization in transfer learning with a source-free approach, consistently outperforming other pruning methods and even surpassing full-…
Bayesian-guided Label Mapping for Visual Reprogramming
·3607 words·17 mins
Transfer Learning 🏢 University of Melbourne
Bayesian-guided Label Mapping (BLM) enhances visual reprogramming by replacing deterministic one-to-one label mappings with a probabilistic, Bayesian-guided mapping between pretrained and downstream labels.
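A hedged sketch of probabilistic label mapping in general: accumulate the pretrained model's predicted class probabilities per downstream label, normalise them into a mapping matrix, and map pretrained outputs to downstream scores with a matrix product. The helper `probabilistic_label_mapping` and the toy shapes are assumptions; BLM's Bayesian construction is not reproduced here.

```python
import torch

def probabilistic_label_mapping(pretrained_probs: torch.Tensor,
                                downstream_labels: torch.Tensor,
                                n_downstream: int) -> torch.Tensor:
    """Hypothetical sketch: estimate a probabilistic mapping from pretrained
    classes to downstream labels by soft co-occurrence counting, then
    row-normalising. Illustrative only; not BLM's exact construction."""
    n_pretrained = pretrained_probs.shape[1]
    counts = torch.zeros(n_pretrained, n_downstream)
    for probs, y in zip(pretrained_probs, downstream_labels):
        counts[:, y] += probs                            # soft co-occurrence counts
    return counts / counts.sum(dim=1, keepdim=True).clamp_min(1e-8)

# toy usage: 100 training images, 1000 pretrained classes, 10 downstream classes
pretrained_probs = torch.softmax(torch.randn(100, 1000), dim=1)
labels = torch.randint(0, 10, (100,))
M = probabilistic_label_mapping(pretrained_probs, labels, n_downstream=10)
downstream_scores = torch.softmax(torch.randn(4, 1000), dim=1) @ M   # (4, 10)
```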
Bayesian Domain Adaptation with Gaussian Mixture Domain-Indexing
·2584 words·13 mins
AI Generated Machine Learning Transfer Learning 🏢 Sun Yat-Sen University
GMDI: a novel Bayesian domain adaptation algorithm significantly improves adaptation by dynamically modeling domain indices using Gaussian Mixture Models, outperforming state-of-the-art methods.
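A minimal sketch, assuming soft domain indices can be represented by mixture responsibilities: fit a scikit-learn `GaussianMixture` over pooled per-sample embeddings and read off `predict_proba`. This only illustrates the Gaussian-mixture idea; GMDI's Bayesian domain-indexing model is not reproduced here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical sketch: infer "soft" domain indices by fitting a Gaussian
# mixture over per-sample embeddings pooled from several domains, then
# reading off mixture responsibilities as the index.
rng = np.random.default_rng(0)
embeddings = np.concatenate([
    rng.normal(loc=0.0, scale=1.0, size=(200, 16)),   # samples from domain A
    rng.normal(loc=3.0, scale=1.0, size=(200, 16)),   # samples from domain B
])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
gmm.fit(embeddings)

soft_index = gmm.predict_proba(embeddings)   # (400, 2) mixture responsibilities
hard_index = soft_index.argmax(axis=1)       # discrete domain assignment
```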
Adversarially Robust Multi-task Representation Learning
·252 words·2 mins
Machine Learning Transfer Learning 🏢 Johns Hopkins University
Multi-task learning boosts adversarial robustness in transfer learning by leveraging diverse source data to build a shared representation, enabling effective learning in data-scarce target tasks, as p…