
AI Theory

Efficient Policy Evaluation Across Multiple Different Experimental Datasets
·1615 words·8 mins
AI Theory Causality 🏢 Purdue University
This paper presents novel graphical criteria and estimators for accurately evaluating policy effectiveness across multiple experimental datasets, even when data distributions differ.
Efficient Graph Matching for Correlated Stochastic Block Models
·2079 words·10 mins
AI Theory Optimization 🏢 Northwestern University
Efficient algorithm achieves near-perfect graph matching in correlated stochastic block models, resolving a key open problem and enabling improved community detection.
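The problem setting can be pictured with a minimal generator sketch (pure Python, illustrative only, not the paper's matching algorithm): a parent SBM graph is drawn, and each of the two observed copies keeps every parent edge independently with probability s; in the actual task the second copy's labels are hidden by a random permutation that matching must recover.

```python
import random

# Minimal sketch of the correlated SBM setting (not the paper's algorithm):
# draw a parent two-community SBM graph, then let each observed copy keep
# every parent edge independently with probability s.
def correlated_sbm_pair(n, p, q, s, seed=0):
    rng = random.Random(seed)
    community = [i % 2 for i in range(n)]            # two equal communities
    g1, g2 = set(), set()
    for i in range(n):
        for j in range(i + 1, n):
            prob = p if community[i] == community[j] else q
            if rng.random() < prob:                  # edge in the parent graph
                if rng.random() < s:
                    g1.add((i, j))                   # copy 1 keeps the edge
                if rng.random() < s:
                    g2.add((i, j))                   # copy 2 keeps the edge
    return community, g1, g2

community, g1, g2 = correlated_sbm_pair(n=200, p=0.10, q=0.02, s=0.8)
print(len(g1), len(g2), len(g1 & g2))  # shared edges are what matching exploits
```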
Efficient Combinatorial Optimization via Heat Diffusion
·2280 words·11 mins
AI Theory Optimization 🏢 Fudan University
Heat Diffusion Optimization (HeO) framework efficiently solves combinatorial optimization problems by enabling information propagation through heat diffusion, outperforming existing methods.
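As a rough illustration of the heat-diffusion idea (a generic Gaussian-smoothing sketch, not HeO itself): diffusing an objective for time t amounts to convolving it with a Gaussian of scale sqrt(2t), and the gradient of that smoothed objective can be estimated from random samples.

```python
import numpy as np

# Generic heat-kernel smoothing sketch (not HeO): estimate the gradient of the
# Gaussian-smoothed objective E[f(x + sqrt(2t) z)] by Monte Carlo sampling.
def smoothed_grad(f, x, t, n_samples=256, rng=None):
    rng = rng or np.random.default_rng(0)
    sigma = np.sqrt(2.0 * t)
    z = rng.standard_normal((n_samples, x.size))
    fvals = np.array([f(x + sigma * zi) for zi in z])
    # Score-function estimator with a mean baseline to reduce variance.
    return ((fvals - fvals.mean())[:, None] * z).mean(axis=0) / sigma

# Toy non-convex objective standing in for a relaxed combinatorial cost.
f = lambda v: np.sum(np.cos(3.0 * v) + 0.1 * v ** 2)

x = np.random.default_rng(1).standard_normal(10)
for step in range(200):
    t = 1.0 * (1.0 - step / 200) + 1e-3   # anneal the diffusion time towards 0
    x -= 0.05 * smoothed_grad(f, x, t)
print(f(x))
```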
Efficient and Private Marginal Reconstruction with Local Non-Negativity
·1932 words·10 mins
AI Generated AI Theory Privacy 🏢 University of Massachusetts, Amherst
Efficiently and privately reconstructing marginal queries from noisy data using residuals improves accuracy of existing differential privacy mechanisms.
Efficient $\Phi$-Regret Minimization with Low-Degree Swap Deviations in Extensive-Form Games
·570 words·3 mins
AI Generated AI Theory Optimization 🏢 Carnegie Mellon University
New efficient algorithms minimize regret in extensive-form games by cleverly using low-degree swap deviations and a relaxed fixed-point concept, improving correlated equilibrium computation.
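For context, $\Phi$-regret compares the learner's play against the best fixed deviation map in a set $\Phi$; the standard definition (not specific to this paper) is:

```latex
% Phi-regret of a learner playing x_1, ..., x_T against utilities u_1, ..., u_T:
% the benchmark applies the best fixed deviation map phi in Phi to every play.
\[
  \mathrm{Reg}_{\Phi}^{T} \;=\; \max_{\phi \in \Phi} \sum_{t=1}^{T}
    \Bigl( u_t\bigl(\phi(x_t)\bigr) - u_t(x_t) \Bigr)
\]
% External regret corresponds to constant maps, swap regret to all maps;
% low-degree swap deviations sit between these two extremes.
```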
Efficiency of the First-Price Auction in the Autobidding World
·465 words·3 mins
AI Theory Optimization 🏢 Google Research
First-price auction efficiency in autobidding plummets to 45.7% with mixed bidders, but machine-learned advice restores optimality.
ECLipsE: Efficient Compositional Lipschitz Constant Estimation for Deep Neural Networks
·2852 words·14 mins
AI Theory Robustness 🏢 Purdue University
ECLipsE: A novel compositional approach drastically accelerates Lipschitz constant estimation for deep neural networks, achieving speedups of thousands of times compared to the state-of-the-art while …
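For orientation, the baseline that compositional methods improve upon is the product of per-layer spectral norms; a minimal NumPy sketch of that classical bound (illustrative only, not ECLipsE):

```python
import numpy as np

# Classical compositional bound: for a feed-forward network with 1-Lipschitz
# activations (e.g. ReLU), the product of per-layer spectral norms upper-bounds
# the network's Lipschitz constant.  ECLipsE-style methods tighten this by
# solving small per-layer subproblems that pass information between layers.
def naive_lipschitz_upper_bound(weight_matrices):
    bound = 1.0
    for W in weight_matrices:
        bound *= np.linalg.norm(W, ord=2)    # spectral norm of each layer
    return bound

rng = np.random.default_rng(0)
weights = [rng.standard_normal((32, 16)),    # layer 1: R^16 -> R^32
           rng.standard_normal((32, 32)),    # layer 2: R^32 -> R^32
           rng.standard_normal((1, 32))]     # layer 3: R^32 -> R
print(naive_lipschitz_upper_bound(weights))
```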
Dynamic Service Fee Pricing under Strategic Behavior: Actions as Instruments and Phase Transition
·2204 words·11 mins
AI Generated AI Theory Optimization 🏢 MIT
This research introduces novel algorithms to dynamically price third-party platform service fees under strategic buyer behavior, achieving optimal revenue with a theoretically proven regret bound.
Dueling over Dessert, Mastering the Art of Repeated Cake Cutting
·2291 words·11 mins
AI Theory Fairness 🏢 University of Maryland
Repeated cake-cutting game reveals that strategic players can exploit myopic opponents, but equitable outcomes are achievable through specific strategies.
Dual-Perspective Activation: Efficient Channel Denoising via Joint Forward-Backward Criterion for Artificial Neural Networks
·1941 words·10 mins
AI Theory Interpretability 🏢 Zhejiang University
Dual-Perspective Activation (DPA) efficiently denoises ANN channels by jointly using forward and backward propagation criteria, improving sparsity and accuracy.
Dual Lagrangian Learning for Conic Optimization
·2010 words·10 mins
AI Generated AI Theory Optimization
Dual Lagrangian Learning (DLL) revolutionizes conic optimization by leveraging machine learning to efficiently learn high-quality dual-feasible solutions, achieving 1000x speedups over traditional sol…
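The appeal of learning dual-feasible solutions comes from weak conic duality, which makes every learned dual point a valid certificate (standard duality, not a result specific to DLL):

```latex
% Standard conic primal-dual pair (K a proper cone, K^* its dual cone):
%   primal:  min_x      c^\top x   s.t.  A x = b,            x \in K
%   dual:    max_{y,s}  b^\top y   s.t.  A^\top y + s = c,   s \in K^*
% Weak duality: for any primal-feasible x and dual-feasible (y, s),
\[
  b^\top y \;\le\; c^\top x ,
\]
% so a learned dual-feasible point always certifies a valid lower bound
% on the primal optimum.
```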
DropEdge not Foolproof: Effective Augmentation Method for Signed Graph Neural Networks
·2471 words·12 mins
AI Theory Representation Learning 🏢 Huazhong Agricultural University
SGA: A novel framework boosts Signed Graph Neural Network performance by addressing graph sparsity and unbalanced triangles, achieving up to 26.2% F1-micro improvement.
Drago: Primal-Dual Coupled Variance Reduction for Faster Distributionally Robust Optimization
·1908 words·9 mins
AI Theory Optimization 🏢 University of Washington
DRAGO: A novel primal-dual algorithm delivers faster, state-of-the-art convergence for distributionally robust optimization.
DOPPLER: Differentially Private Optimizers with Low-pass Filter for Privacy Noise Reduction
·2545 words·12 mins
AI Theory Privacy 🏢 University of Southern California
DOPPLER, a novel low-pass filter, significantly enhances differentially private (DP) optimizer performance by reducing the impact of privacy noise, bridging the gap between DP and non-DP training.
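A minimal sketch of the underlying setup: a standard DP-SGD step (clip per-sample gradients, add Gaussian noise) followed by an exponential moving average standing in for a low-pass filter. DOPPLER's actual filter design differs; this only illustrates the intuition that the gradient signal is low-frequency while the injected DP noise is white.

```python
import numpy as np

# Standard DP-SGD step: clip each per-sample gradient, average, add Gaussian
# noise calibrated to the clipping norm and noise multiplier.
def dp_sgd_step(per_sample_grads, clip_norm, noise_mult, rng):
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_sample_grads]
    mean_grad = np.mean(clipped, axis=0)
    noise_std = noise_mult * clip_norm / len(clipped)
    return mean_grad + rng.standard_normal(mean_grad.shape) * noise_std

rng = np.random.default_rng(0)
theta = np.zeros(10)
filtered = np.zeros(10)
beta = 0.9                                   # filter coefficient (assumed)
for _ in range(100):
    per_sample_grads = [theta - rng.standard_normal(10) for _ in range(32)]  # toy
    noisy = dp_sgd_step(per_sample_grads, clip_norm=1.0, noise_mult=1.0, rng=rng)
    filtered = beta * filtered + (1.0 - beta) * noisy   # low-pass the DP gradients
    theta -= 0.1 * filtered
print(np.linalg.norm(theta))
```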
Do Finetti: On Causal Effects for Exchangeable Data
·1344 words·7 mins
AI Theory Causality 🏢 Max Planck Institute
Causal inference revolutionized: New framework estimates causal effects from exchangeable data, enabling simultaneous causal discovery and effect estimation via the Do-Finetti algorithm.
Divide-and-Conquer Predictive Coding: a structured Bayesian inference algorithm
·1683 words·8 mins
AI Theory Representation Learning 🏢 Department of Psychology, Vanderbilt University
Divide-and-conquer predictive coding (DCPC) revolutionizes structured Bayesian inference by achieving superior performance in high-dimensional problems while remaining biologically plausible.
Diversity Is Not All You Need: Training A Robust Cooperative Agent Needs Specialist Partners
·1922 words·10 mins
AI Theory Robustness 🏢 VISTEC
Training robust cooperative AI agents requires diverse and specialized training partners, but existing methods often produce overfit partners. This paper proposes novel methods using reinforcement and…
DistrictNet: Decision-aware learning for geographical districting
·2460 words·12 mins
AI Theory Optimization 🏢 Polytechnique Montreal
DISTRICTNET: A novel decision-aware learning approach drastically cuts geographical districting costs by integrating combinatorial optimization and graph neural networks.
Distributionally Robust Performative Prediction
·2341 words·11 mins
AI Generated AI Theory Optimization 🏢 University of Michigan
This research introduces distributionally robust performative prediction, offering a new solution concept (DRPO) that minimizes performative risk even with misspecified distribution maps, ensuring rob…
Distributional regression: CRPS-error bounds for model fitting, model selection and convex aggregation
·348 words·2 mins
AI Theory Optimization 🏢 University of Franche-Comté
This paper provides the first statistical learning guarantees for distributional regression using CRPS, offering concentration bounds for model fitting, selection, and convex aggregation, applicable t…
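For reference, the CRPS of a predictive CDF F at an observed outcome y is the loss in which the paper's bounds are stated (standard definition):

```latex
% Continuous Ranked Probability Score of a predictive CDF F at outcome y:
\[
  \mathrm{CRPS}(F, y) \;=\; \int_{-\infty}^{\infty}
    \bigl( F(x) - \mathbf{1}\{ y \le x \} \bigr)^{2} \, \mathrm{d}x
\]
```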