
Machine Learning

Linear Uncertainty Quantification of Graphical Model Inference
·1866 words·9 mins
Machine Learning Active Learning 🏒 Key Laboratory of Trustworthy Distributed Computing and Service (MoE), Beijing University of Posts and Telecommunications
LinUProp: Linearly scalable uncertainty quantification for graphical models, achieving higher accuracy with lower labeling budgets!
Linear Transformers are Versatile In-Context Learners
·1783 words·9 mins
Machine Learning Optimization 🏒 Google Research
Linear transformers learn to implement intricate optimization algorithms, even surpassing baselines on noisy regression problems, an unexpected display of their in-context learning capabilities.
Light Unbalanced Optimal Transport
·2953 words·14 mins
Machine Learning Optimization 🏒 Skolkovo Institute of Science and Technology
LightUnbalancedOptimalTransport: A fast, theoretically justified solver for continuous unbalanced optimal transport problems, enabling efficient analysis of large datasets with imbalanced classes.
LFME: A Simple Framework for Learning from Multiple Experts in Domain Generalization
·2799 words·14 mins
Machine Learning Domain Generalization 🏒 MBZUAI
LFME: A novel framework improves domain generalization by training multiple expert models alongside a target model, using logit regularization for enhanced performance.
Leveraging Separated World Model for Exploration in Visually Distracted Environments
·2320 words·11 mins
Machine Learning Reinforcement Learning 🏒 School of Artificial Intelligence, Nanjing University, China
SeeX, a novel bi-level optimization framework, effectively tackles the challenge of exploration in visually cluttered environments by training a separated world model to extract relevant information a…
Leveraging partial stragglers within gradient coding
·1641 words·8 mins
Machine Learning Federated Learning 🏒 Iowa State University
New gradient coding protocols efficiently leverage partial results from slow worker nodes, accelerating distributed training by approximately 2x and significantly improving accuracy.
Leveraging Drift to Improve Sample Complexity of Variance Exploding Diffusion Models
·1640 words·8 mins
Machine Learning Deep Learning 🏒 John Hopcroft Center for Computer Science
Drifted VESDE: Faster convergence, efficient sampling for variance-exploding diffusion models!
Leveraging Contrastive Learning for Enhanced Node Representations in Tokenized Graph Transformers
·1604 words·8 mins
Machine Learning Representation Learning 🏒 Huazhong University of Science and Technology
GCFormer, a novel graph Transformer, enhances node representation learning by employing a hybrid token generator and contrastive learning, outperforming existing methods on various datasets.
Learning World Models for Unconstrained Goal Navigation
·2782 words·14 mins
Machine Learning Reinforcement Learning 🏒 Rutgers University
MUN: a novel goal-directed exploration algorithm significantly improves world model reliability and policy generalization in sparse-reward goal-conditioned RL, enabling efficient navigation across div…
Learning via Surrogate PAC-Bayes
·1481 words·7 mins
Machine Learning Meta Learning 🏒 Inria
Surrogate PAC-Bayes Learning (SuPAC) efficiently optimizes generalization bounds by iteratively minimizing surrogate training objectives, enabling faster and more scalable learning for complex models.
Learning Versatile Skills with Curriculum Masking
·2688 words·13 mins
AI Generated Machine Learning Reinforcement Learning 🏒 Shanghai Jiao Tong University
CurrMask: a novel curriculum masking paradigm for offline RL, achieving superior zero-shot and fine-tuning performance by dynamically adjusting masking schemes during pretraining, enabling versatile s…
Learning to Shape In-distribution Feature Space for Out-of-distribution Detection
·1662 words·8 mins
Machine Learning Representation Learning 🏒 Hong Kong Baptist University
Deterministically shaping the in-distribution feature space removes OOD detection’s reliance on distributional assumptions, leading to superior performance.
Learning to Predict Structural Vibrations
·4242 words·20 mins
AI Generated Machine Learning Deep Learning 🏒 Institute of Computer Science, University of Göttingen
Deep learning predicts structural vibrations faster than traditional methods, helping to reduce noise in airplanes, cars, and buildings, as shown by a new benchmark and a frequency-query operator network.
Learning to Embed Distributions via Maximum Kernel Entropy
·1819 words·9 mins
AI Generated Machine Learning Unsupervised Learning 🏒 Dipartimento di Matematica, Università degli Studi di Genova
Learn optimal data-dependent distribution kernels via Maximum Kernel Entropy, eliminating manual kernel selection and boosting performance on various downstream tasks.
Learning to Balance Altruism and Self-interest Based on Empathy in Mixed-Motive Games
·2604 words·13 mins
Machine Learning Reinforcement Learning 🏒 Peking University
AI agents learn to balance helpfulness and self-preservation using empathy to gauge social relationships and guide reward sharing.
Learning the Optimal Policy for Balancing Short-Term and Long-Term Rewards
·1775 words·9 mins
Machine Learning Reinforcement Learning 🏒 ByteDance Research
A novel Decomposition-based Policy Learning (DPPL) method optimally balances short-term and long-term rewards, even with interrelated objectives, by transforming the problem into intuitive subproblems…
Learning the Latent Causal Structure for Modeling Label Noise
·2995 words·15 mins
Machine Learning Semi-Supervised Learning 🏒 University of Sydney
Learning latent causal structures improves label noise modeling by accurately estimating noise transition matrices without relying on similarity-based assumptions, leading to state-of-the-art classifi…
Learning the Infinitesimal Generator of Stochastic Diffusion Processes
·1835 words·9 mins
AI Generated Machine Learning Deep Learning 🏒 CSML, Istituto Italiano Di Tecnologia
Learn infinitesimal generators of stochastic diffusion processes efficiently via a novel energy-based risk functional, overcoming the unbounded nature of the generator and providing learning bounds in…
Learning symmetries via weight-sharing with doubly stochastic tensors
·2346 words·12 mins
Machine Learning Deep Learning 🏒 Amsterdam Machine Learning Lab
Learn symmetries directly from data with flexible weight-sharing via learnable doubly stochastic tensors!
Learning Successor Features the Simple Way
·9069 words·43 mins
Machine Learning Reinforcement Learning 🏒 Google DeepMind
Learn deep Successor Features (SFs) directly from pixels, efficiently and without representation collapse, using a novel, simple method combining TD and reward prediction loss!