Transfer Learning
Universality in Transfer Learning for Linear Models
·1460 words·7 mins·
AI Generated
Machine Learning
Transfer Learning
California Institute of Technology
Transfer learning in linear models yields universal generalization-error improvements that depend only on the first- and second-order statistics of the target task, without requiring Gaussian assumptions.
Transfer Learning for Latent Variable Network Models
·1891 words·9 mins·
Machine Learning
Transfer Learning
University of Texas at Austin
This paper presents efficient algorithms for transfer learning in latent variable network models, achieving vanishing error under specific conditions and attaining minimax optimal rates for stochastic block models.
Towards Understanding Extrapolation: a Causal Lens
·2076 words·10 mins·
Machine Learning
Transfer Learning
Carnegie Mellon University
This work examines extrapolation through a causal lens, offering theoretical guarantees for accurate predictions on out-of-support data, even with limited target samples.
To Learn or Not to Learn, That is the Question – A Feature-Task Dual Learning Model of Perceptual Learning
·1867 words·9 mins·
Machine Learning
Transfer Learning
Peking University
A new dual-learning model resolves the paradox of perceptual learning, showing how task-based and feature-based learning interact to produce both specific and transferable improvements in sensory perception.
The Impact of Geometric Complexity on Neural Collapse in Transfer Learning
·1870 words·9 mins·
Machine Learning
Transfer Learning
Google Research
Lowering a neural network’s geometric complexity during pre-training enhances neural collapse and improves transfer learning, especially in few-shot scenarios.
TFGDA: Exploring Topology and Feature Alignment in Semi-supervised Graph Domain Adaptation through Robust Clustering
·1822 words·9 mins·
Machine Learning
Transfer Learning
Zhejiang University
TFGDA: Leveraging graph topology and feature alignment for superior semi-supervised domain adaptation.
Style Adaptation and Uncertainty Estimation for Multi-Source Blended-Target Domain Adaptation
·1973 words·10 mins·
Machine Learning
Transfer Learning
South China Normal University
SAUE: A novel multi-source blended-target domain adaptation approach using style adaptation and uncertainty estimation to improve model robustness and accuracy.
Schrödinger Bridge Flow for Unpaired Data Translation
·3752 words·18 mins·
Transfer Learning
Google DeepMind
Accelerate unpaired data translation with Schrödinger Bridge Flow, a novel algorithm solving optimal transport problems efficiently without repeatedly training models!
Revealing Distribution Discrepancy by Sampling Transfer in Unlabeled Data
·2081 words·10 mins·
AI Generated
Machine Learning
Transfer Learning
School of Computing, Macquarie University
I-Div accurately quantifies distribution discrepancy between training and test datasets without test labels, enabling reliable hypothesis applicability evaluation in complex scenarios.
Reinforced Cross-Domain Knowledge Distillation on Time Series Data
·2657 words·13 mins·
Machine Learning
Transfer Learning
Institute for Infocomm Research, A*STAR, Singapore
Reinforced Cross-Domain Knowledge Distillation (RCD-KD) dynamically selects target samples for efficient knowledge transfer from a complex teacher model to a compact student model, achieving superior …
Optimal Aggregation of Prediction Intervals under Unsupervised Domain Shift
·1968 words·10 mins·
AI Generated
Machine Learning
Transfer Learning
Princeton University
This paper introduces a novel method for creating highly accurate and narrow prediction intervals even when the data distribution shifts unexpectedly, significantly improving machine learning model reliability.
On $f$-Divergence Principled Domain Adaptation: An Improved Framework
·1963 words·10 mins·
Machine Learning
Transfer Learning
Tongji University
An improved unsupervised domain adaptation framework achieves superior performance via a refined $f$-divergence and a novel $f$-domain discrepancy, enabling faster algorithms and tighter generalization bounds.
Neural decoding from stereotactic EEG: accounting for electrode variability across subjects
·1818 words·9 mins·
Machine Learning
Transfer Learning
Stanford University
Scalable SEEG decoding model, seegnificant, leverages transformers to decode behavior across subjects despite electrode variability, achieving high accuracy and transfer learning capability.
Monte Carlo Tree Search based Space Transfer for Black Box Optimization
·2970 words·14 mins·
Transfer Learning
Nanjing University
MCTS-transfer: Iteratively refining Bayesian optimization via Monte Carlo tree search for efficient black-box optimization using transfer learning.
Large Scale Transfer Learning for Tabular Data via Language Modeling
·2834 words·14 mins·
Machine Learning
Transfer Learning
University of Washington
TABULA-8B, a novel language model for tabular prediction, achieves state-of-the-art zero-shot and few-shot performance across various benchmarks, exceeding existing methods by 5-15 percentage points.
Inductive biases of multi-task learning and finetuning: multiple regimes of feature reuse
·3248 words·16 mins·
AI Generated
Machine Learning
Transfer Learning
Columbia University
Multi-task learning and finetuning show surprising feature reuse biases, including a novel 'nested feature selection' regime where finetuning prioritizes a sparse subset of pretrained features, signif…
Handling Learnwares from Heterogeneous Feature Spaces with Explicit Label Exploitation
·2061 words·10 mins·
Machine Learning
Transfer Learning
National Key Laboratory for Novel Software Technology, Nanjing University, China
This paper enhances learnware dock systems by using model outputs to improve heterogeneous learnware management, enabling effective task handling even without perfectly matched models.
GFT: Graph Foundation Model with Transferable Tree Vocabulary
·4791 words·23 mins·
AI Generated
Machine Learning
Transfer Learning
University of Notre Dame
GFT: a novel graph foundation model using transferable computation trees as tokens, improving generalization and reducing negative transfer in graph learning.
Geodesic Optimization for Predictive Shift Adaptation on EEG data
·2001 words·10 mins·
Transfer Learning
Inria
GOPSA: a novel geodesic optimization method significantly improves cross-site age prediction from EEG data by jointly handling shifts in data and predictive variables.
Fine-Tuning is Fine, if Calibrated
·4429 words·21 mins·
Machine Learning
Transfer Learning
Ohio State University
Fine-tuning pre-trained models often degrades performance on unseen classes. This work reveals that the problem stems from logit scale discrepancies, not feature loss, and shows that post-processing calibration can restore it.