Meta Learning

Transformers are Minimax Optimal Nonparametric In-Context Learners
·1461 words·7 mins
AI Generated Machine Learning Meta Learning 🏢 University of Tokyo
Transformers excel at in-context learning by leveraging minimax-optimal nonparametric learning, achieving near-optimal risk with sufficient pretraining data diversity.
Theoretical Investigations and Practical Enhancements on Tail Task Risk Minimization in Meta Learning
·3609 words·17 mins
AI Generated Machine Learning Meta Learning 🏢 College of Science, National University of Defense Technology
This research enhances meta-learning robustness by theoretically grounding and practically improving tail-risk minimization, achieving improved fast adaptation in the task space.
SPARKLE: A Unified Single-Loop Primal-Dual Framework for Decentralized Bilevel Optimization
·1927 words·10 mins
Machine Learning Meta Learning 🏢 Peking University
SPARKLE: A single-loop primal-dual framework unifies decentralized bilevel optimization, enabling flexible heterogeneity-correction and mixed update strategies for improved convergence.
On the Stability and Generalization of Meta-Learning
·1358 words·7 mins
Machine Learning Meta Learning 🏢 Johns Hopkins University
This paper introduces uniform meta-stability for meta-learning, providing tighter generalization bounds for convex and weakly convex problems and addressing the computational limitations of existing algorithms.
On the Identifiability of Hybrid Deep Generative Models: Meta-Learning as a Solution
·1881 words·9 mins
Machine Learning Meta Learning 🏢 Rochester Institute of Technology
Meta-learning solves hybrid deep generative model unidentifiability!
Model Based Inference of Synaptic Plasticity Rules
·2178 words·11 mins
Machine Learning Meta Learning 🏢 Janelia Research Campus
New computational method infers complex brain learning rules from experimental data, revealing active forgetting in reward learning.
Meta-Learning Universal Priors Using Non-Injective Change of Variables
·2169 words·11 mins
AI Generated Machine Learning Meta Learning 🏢 University of Minnesota
MetaNCoV: Learn data-driven priors via non-injective change of variables for enhanced few-shot learning.
Memory-Efficient Gradient Unrolling for Large-Scale Bi-level Optimization
·3095 words·15 mins
AI Generated Machine Learning Meta Learning 🏢 National University of Singapore
(FG)²U: a novel memory-efficient algorithm for unbiased stochastic approximation of meta-gradients in large-scale bi-level optimization, showing superior performance across diverse tasks.
Linear Regression using Heterogeneous Data Batches
·1554 words·8 mins
Meta Learning 🏢 Google Research
New algorithm efficiently solves linear regression with heterogeneous data batches, handling diverse input distributions and achieving high accuracy with fewer samples.
Learning via Surrogate PAC-Bayes
·1481 words·7 mins
Machine Learning Meta Learning 🏢 Inria
Surrogate PAC-Bayes Learning (SuPAC) efficiently optimizes generalization bounds by iteratively optimizing surrogate training objectives, enabling faster and more scalable learning for complex models.
In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness
·1771 words·9 mins
Meta Learning 🏢 University of Texas at Austin
Softmax attention in transformers adapts its attention window to function Lipschitzness and noise, enabling efficient in-context learning.
First-Order Minimax Bilevel Optimization
·1619 words·8 mins
AI Generated Machine Learning Meta Learning 🏢 University at Buffalo
Two novel first-order algorithms, FOSL and MemCS, efficiently solve multi-block minimax bilevel optimization problems, significantly improving performance in deep AUC maximization and robust meta-learning.
FasMe: Fast and Sample-efficient Meta Estimator for Precision Matrix Learning in Small Sample Settings
·2135 words·11 mins
Machine Learning Meta Learning 🏢 Monash University
FasMe: a novel meta-learning approach delivers fast and sample-efficient precision matrix estimation, surpassing existing methods in accuracy and speed for small sample datasets.
Fairness-Aware Meta-Learning via Nash Bargaining
·2445 words·12 mins
Machine Learning Meta Learning 🏢 Virginia Tech
Nash bargaining resolves hypergradient conflicts in fairness-aware meta-learning, boosting model performance and fairness.
Discovering plasticity rules that organize and maintain neural circuits
·1657 words·8 mins
Machine Learning Meta Learning 🏢 University of Washington
AI discovers robust, biologically-plausible plasticity rules that self-organize and maintain neural circuits’ sequential activity, even with synaptic turnover.
Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning
·3736 words·18 mins
AI Generated Machine Learning Meta Learning 🏢 Sorbonne Université
GEPS enhances parametric PDE solver generalization by using adaptive conditioning, achieving superior performance with limited data.
An Accelerated Algorithm for Stochastic Bilevel Optimization under Unbounded Smoothness
·406 words·2 mins
Machine Learning Meta Learning 🏢 George Mason University
AccBO: A new accelerated algorithm achieves O(ε⁻³) oracle complexity for stochastic bilevel optimization with unbounded smoothness, significantly improving upon existing O(ε⁻⁴) methods.
Advancing Open-Set Domain Generalization Using Evidential Bi-Level Hardest Domain Scheduler
·3296 words·16 mins
Machine Learning Meta Learning 🏢 Karlsruhe Institute of Technology
EBiL-HaDS, a novel adaptive domain scheduler, significantly boosts open-set domain generalization by strategically sequencing training domains based on model reliability assessments.
Addressing Hidden Confounding with Heterogeneous Observational Datasets for Recommendation
·2627 words·13 mins
AI Generated Machine Learning Meta Learning 🏢 Peking University
MetaDebias tackles hidden confounding in recommender systems using heterogeneous observational data, achieving state-of-the-art performance without expensive RCT data.
A Metalearned Neural Circuit for Nonparametric Bayesian Inference
·2042 words·10 mins
Machine Learning Meta Learning 🏢 Princeton University
Metalearning a neural circuit mimics nonparametric Bayesian inference, enabling fast, accurate, open-set classification.