AI Theory
Generalization Analysis for Label-Specific Representation Learning
·269 words·2 mins·
AI Theory
Representation Learning
Southeast University
Researchers derived tighter generalization bounds for label-specific representation learning (LSRL) methods, improving understanding of LSRL’s success and offering guidance for future algorithm develo…
Generalizability of Memorization Neural Networks
·1319 words·7 mins·
AI Theory
Generalization
Chinese Academy of Sciences
Unlocking deep learning’s generalization mystery, this research pioneers a theoretical understanding of memorization neural network generalizability, revealing critical network structural requirements…
General bounds on the quality of Bayesian coresets
·1364 words·7 mins·
AI Theory
Optimization
University of British Columbia
New theoretical bounds on Bayesian coreset approximation errors enable efficient large-scale Bayesian inference, overcoming prior limitations and improving coreset construction methods.
Fundamental Convergence Analysis of Sharpness-Aware Minimization
·2234 words·11 mins·
AI Theory
Optimization
Ho Chi Minh City University of Education
This research establishes fundamental convergence properties for the widely used SAM optimization algorithm, significantly advancing its theoretical understanding and guiding practical applications.
Functionally Constrained Algorithm Solves Convex Simple Bilevel Problem
·310 words·2 mins·
AI Theory
Optimization
Tsinghua University
Reformulating convex simple bilevel problems as functionally constrained problems yields algorithms with near-optimal convergence rates.
FUGAL: Feature-fortified Unrestricted Graph Alignment
·2409 words·12 mins·
AI Theory
Optimization
IIT Delhi
FUGAL, a groundbreaking graph alignment method, surpasses state-of-the-art accuracy without compromising efficiency by directly aligning adjacency matrices.
From Linear to Linearizable Optimization: A Novel Framework with Applications to Stationary and Non-stationary DR-submodular Optimization
·1591 words·8 mins·
AI Theory
Optimization
McGill University
A novel framework extends optimization algorithms from linear/quadratic functions to a broader class of ‘upper-linearizable’ functions, providing a unified approach for concave and DR-submodular optim…
From Causal to Concept-Based Representation Learning
·1733 words·9 mins·
AI Theory
Representation Learning
Carnegie Mellon University
This paper introduces a novel geometric approach to concept-based representation learning, provably recovering interpretable concepts from diverse data without strict causal assumptions or many interv…
First-Order Methods for Linearly Constrained Bilevel Optimization
·392 words·2 mins·
AI Theory
Optimization
Weizmann Institute of Science
First-order methods conquer linearly constrained bilevel optimization, achieving near-optimal convergence rates and enhancing high-dimensional applicability.
FERERO: A Flexible Framework for Preference-Guided Multi-Objective Learning
·2495 words·12 mins·
AI Theory
Optimization
Rensselaer Polytechnic Institute
FERERO, a novel framework, tackles multi-objective learning by efficiently finding preference-guided Pareto solutions using flexible preference modeling and convergent algorithms.
FEEL-SNN: Robust Spiking Neural Networks with Frequency Encoding and Evolutionary Leak Factor
·2784 words·14 mins·
AI Generated
AI Theory
Robustness
College of Computer Science and Technology, Zhejiang University
FEEL-SNN enhances spiking neural network robustness by mimicking biological visual attention and adaptive leak factors, resulting in improved resilience against noise and attacks.
Feedback control guides credit assignment in recurrent neural networks
·1962 words·10 mins·
AI Theory
Optimization
Imperial College London
Brain-inspired recurrent neural networks learn efficiently by using feedback control to approximate optimal gradients, enabling rapid movement corrections and efficient adaptation to persistent errors…
Faster Repeated Evasion Attacks in Tree Ensembles
·4214 words·20 mins·
AI Generated
AI Theory
Robustness
KU Leuven
Speed up repeated evasion attacks on tree ensembles by 36x using feature perturbation insights!
Faster Differentially Private Top-$k$ Selection: A Joint Exponential Mechanism with Pruning
·1673 words·8 mins·
loading
·
loading
AI Theory
Privacy
University of Waterloo
Faster differentially private top-k selection achieved via a novel joint exponential mechanism with pruning, reducing time complexity from O(dk) to O(d + k²/ε · ln d).
Faster Algorithms for User-Level Private Stochastic Convex Optimization
·1097 words·6 mins·
loading
·
loading
AI Theory
Privacy
University of Wisconsin-Madison
Faster algorithms achieve optimal excess risk in user-level private stochastic convex optimization, overcoming limitations of prior methods without restrictive assumptions.
Faster Accelerated First-order Methods for Convex Optimization with Strongly Convex Function Constraints
·1492 words·8 mins·
AI Theory
Optimization
Shanghai University of Finance and Economics
Faster primal-dual algorithms achieve order-optimal complexity for convex optimization with strongly convex constraints, improving convergence rates and solving large-scale problems efficiently.
Fast Tree-Field Integrators: From Low Displacement Rank to Topological Transformers
·3010 words·15 mins·
AI Generated
AI Theory
Optimization
Google DeepMind
Fast Tree-Field Integrators (FTFIs) revolutionize graph processing by enabling polylog-linear time computation for integrating tensor fields on trees, providing significant speedups for various machin…
Fast T2T: Optimization Consistency Speeds Up Diffusion-Based Training-to-Testing Solving for Combinatorial Optimization
·2382 words·12 mins·
AI Theory
Optimization
Shanghai Jiao Tong University
Fast T2T: Optimization Consistency Boosts Diffusion-Based Combinatorial Optimization!
Fast Rates in Stochastic Online Convex Optimization by Exploiting the Curvature of Feasible Sets
·1343 words·7 mins·
AI Theory
Optimization
University of Tokyo
This paper introduces a novel approach for fast rates in online convex optimization by exploiting the curvature of feasible sets, achieving logarithmic regret bounds under specific conditions.
Fast Proxy Experiment Design for Causal Effect Identification
·2057 words·10 mins·
AI Theory
Causality
EPFL, Switzerland
This paper presents efficient algorithms for designing cost-optimal proxy experiments to identify causal effects, significantly improving upon prior methods.