🏢 University of Chicago
When are dynamical systems learned from time series data statistically accurate?
·2869 words·14 mins·
AI Theory
Generalization
🏢 University of Chicago
Learned dynamical systems often fail to capture true physical behavior; this work introduces an ergodic-theoretic approach to improve statistical accuracy by incorporating Jacobian information during …
Variance estimation in compound decision theory under boundedness
·323 words·2 mins·
AI Theory
Optimization
🏢 University of Chicago
Unlocking the optimal variance estimation rate in compound decision theory under bounded means, this paper reveals a surprising (log log n/log n)² rate and introduces a rate-optimal cumulant-based estimator.
Training Binary Neural Networks via Gaussian Variational Inference and Low-Rank Semidefinite Programming
·1655 words·8 mins·
AI Generated
Machine Learning
Deep Learning
🏢 University of Chicago
VISPA, a novel BNN training framework using Gaussian variational inference and low-rank SDP, achieves state-of-the-art accuracy on various benchmarks.
Schur Nets: exploiting local structure for equivariance in higher order graph neural networks
·1825 words·9 mins·
AI Theory
Representation Learning
🏢 University of Chicago
Schur Nets boost higher-order GNNs by efficiently exploiting local graph structure for automorphism equivariance, achieving improved performance without the computational burden of traditional methods.
S-SOS: Stochastic Sum-Of-Squares for Parametric Polynomial Optimization
·1441 words·7 mins·
AI Theory
Optimization
🏢 University of Chicago
S-SOS: A new algorithm solves complex parametric polynomial optimization problems with provable convergence, enabling efficient solutions for high-dimensional applications like sensor network localization.
Overfitting Behaviour of Gaussian Kernel Ridgeless Regression: Varying Bandwidth or Dimensionality
·1931 words·10 mins·
AI Generated
AI Theory
Generalization
🏢 University of Chicago
Ridgeless regression, surprisingly, generalizes well even with noisy data if dimension scales sub-polynomially with sample size.
Optimal Algorithms for Learning Partitions with Faulty Oracles
·1450 words·7 mins·
AI Theory
Optimization
🏢 University of Chicago
Optimal algorithms for learning partitions are designed, achieving minimum query complexity even with up to l faulty oracle responses.
Off-policy estimation with adaptively collected data: the power of online learning
·240 words·2 mins·
AI Generated
AI Theory
Causality
🏢 University of Chicago
This paper develops novel finite-sample bounds for off-policy linear treatment effect estimation with adaptively collected data, proposing online learning algorithms to improve estimation accuracy and…
Latent Intrinsics Emerge from Training to Relight
·1743 words·9 mins·
Image Generation
🏢 University of Chicago
A novel data-driven relighting model achieves state-of-the-art accuracy by learning latent intrinsic and extrinsic scene properties, even recovering albedo without explicit supervision.
AgentPoison: Red-teaming LLM Agents via Poisoning Memory or Knowledge Bases
·2659 words·13 mins·
Natural Language Processing
Large Language Models
🏢 University of Chicago
AGENTPOISON: A novel backdoor attack compromises LLM agents by poisoning their memory or knowledge bases, achieving high success rates with minimal performance impact.