Optimization

Piecewise-Stationary Bandits with Knapsacks
·380 words·2 mins
AI Theory Optimization 🏢 National University of Singapore
A novel inventory reserving algorithm achieves near-optimal performance for bandit problems with knapsacks in piecewise-stationary settings, offering a competitive ratio of O(log(n_max/n_min)).
Persistent Homology for High-dimensional Data Based on Spectral Methods
·8023 words·38 mins
AI Generated AI Theory Optimization 🏢 Tübingen AI Center
Spectral distances on k-nearest neighbor graphs enable robust topological analysis of high-dimensional noisy data using persistent homology, overcoming limitations of Euclidean distance.
Penalty-based Methods for Simple Bilevel Optimization under Hölderian Error Bounds
·1969 words·10 mins
Machine Learning Optimization 🏢 Fudan University
This paper proposes penalty-based methods for simple bilevel optimization, achieving (ε, ε^β)-optimal solutions with improved complexity under Hölderian error bounds.
Paths to Equilibrium in Games
·265 words·2 mins
AI Theory Optimization 🏢 University of Toronto
In n-player games, a satisficing path always exists from any initial strategy profile to a Nash equilibrium, provided unsatisfied players are allowed to explore suboptimal strategies.
Parameter-free Clipped Gradient Descent Meets Polyak
·1918 words·10 mins
Machine Learning Optimization 🏢 Kyoto University
Parameter-free optimization is revolutionized! Inexact Polyak Stepsize achieves the same convergence rate as clipped gradient descent without any hyperparameter tuning, saving time and computation.
Parameter Symmetry and Noise Equilibrium of Stochastic Gradient Descent
·1617 words·8 mins
AI Theory Optimization 🏢 Massachusetts Institute of Technology
SGD’s dynamics are precisely characterized by the interplay of noise and symmetry in loss functions, leading to unique, initialization-independent fixed points.
Overcoming Brittleness in Pareto-Optimal Learning Augmented Algorithms
·1976 words·10 mins
AI Theory Optimization 🏢 Sorbonne University
This research introduces a novel framework that overcomes the brittleness of Pareto-optimal learning-augmented algorithms by enforcing smoothness in performance using user-specified profiles and devel…
Outlier-Robust Distributionally Robust Optimization via Unbalanced Optimal Transport
·2832 words·14 mins
AI Generated AI Theory Optimization 🏢 KTH Royal Institute of Technology
Outlier-robust distributionally robust optimization achieved via a novel Unbalanced Optimal Transport (UOT) distance, improving efficiency and accuracy.
OT4P: Unlocking Effective Orthogonal Group Path for Permutation Relaxation
·2531 words·12 mins
AI Generated AI Theory Optimization 🏢 School of Artificial Intelligence, Jilin University
OT4P: a novel temperature-controlled differentiable transformation efficiently relaxes permutation matrices onto the orthogonal group for gradient-based optimization.
Optimizing the coalition gain in Online Auctions with Greedy Structured Bandits
·1842 words·9 mins
AI Theory Optimization 🏢 Department of Statistics, University of Oxford
Two novel algorithms, Local-Greedy and Greedy-Grid, optimize coalition gain in online auctions with limited observations, achieving constant regret and problem-independent guarantees while respecting …
Optimization Can Learn Johnson Lindenstrauss Embeddings
·412 words·2 mins
AI Theory Optimization 🏢 University of Texas at Austin
Optimization can learn optimal Johnson-Lindenstrauss embeddings, avoiding the limitations of randomized methods and achieving comparable theoretical guarantees.
Optimization Algorithm Design via Electric Circuits
·3889 words·19 mins
AI Theory Optimization 🏢 Stanford University
Design provably convergent optimization algorithms quickly using electric-circuit analogies: a novel methodology that automates the discretization step for diverse algorithms.
Optimal Scalarizations for Sublinear Hypervolume Regret
·1664 words·8 mins
AI Theory Optimization 🏢 Google DeepMind
Optimal multi-objective optimization achieved via hypervolume scalarization, offering sublinear regret bounds and outperforming existing methods.
Optimal Parallelization of Boosting
·228 words·2 mins
AI Theory Optimization 🏢 Aarhus University
This paper closes the performance gap in parallel boosting algorithms by presenting improved lower bounds and a novel algorithm matching these bounds, settling the parallel complexity of sample-optimal…
Optimal Multiclass U-Calibration Error and Beyond
·366 words·2 mins
AI Generated AI Theory Optimization 🏢 University of Southern California
This paper proves the minimax optimal U-calibration error is Θ(√KT) for online multiclass prediction, resolving an open problem and showing logarithmic error is achievable for specific loss functions.
Optimal Hypothesis Selection in (Almost) Linear Time
·1628 words·8 mins
AI Theory Optimization 🏢 Rice University
This paper presents the first almost linear-time algorithm achieving the optimal accuracy parameter for hypothesis selection, solving a decades-long open problem.
Optimal Flow Matching: Learning Straight Trajectories in Just One Step
·2531 words·12 mins
AI Generated Machine Learning Optimization 🏢 Skolkovo Institute of Science and Technology
Optimal Flow Matching (OFM) learns straight trajectories for generative modeling in a single step, eliminating iterative processes and improving efficiency.
Optimal Algorithms for Online Convex Optimization with Adversarial Constraints
·1266 words·6 mins
AI Theory Optimization 🏢 Tata Institute of Fundamental Research
Optimal algorithms for online convex optimization with adversarial constraints are developed, achieving O(√T) regret and Õ(√T) constraint violation—a breakthrough in the field.
Optimal Algorithms for Learning Partitions with Faulty Oracles
·1450 words·7 mins
AI Theory Optimization 🏢 University of Chicago
Optimal algorithms for learning partitions are designed, achieving minimum query complexity even with up to ℓ faulty oracle responses.
Optimal Algorithms for Augmented Testing of Discrete Distributions
·1848 words·9 mins
AI Theory Optimization 🏢 Rice University
Leveraging predictions, this research presents novel algorithms for uniformity, identity, and closeness testing of discrete distributions, achieving information-theoretically optimal sample complexity…