Oral AI Theories

2024

Weisfeiler and Leman Go Loopy: A New Hierarchy for Graph Representational Learning
·3118 words·15 mins
AI Theory Representation Learning 🏢 Munich Center for Machine Learning
This paper introduces r-ℓWL, a new graph isomorphism test hierarchy that surpasses the limitations of the Weisfeiler-Leman test by counting cycles up to length r+2, and its GNN counterpart, r-ℓMPNN, w…
Trading Place for Space: Increasing Location Resolution Reduces Contextual Capacity in Hippocampal Codes
·1950 words·10 mins
AI Theory Representation Learning 🏢 University of Pennsylvania
Increasing hippocampal spatial resolution surprisingly shrinks contextual memory capacity, revealing a fundamental trade-off between location precision and context storage.
Optimal Parallelization of Boosting
·228 words·2 mins
AI Theory Optimization 🏢 Aarhus University
This paper closes the performance gap in parallel boosting algorithms by presenting improved lower bounds and a novel algorithm matching these bounds, settling the parallel complexity of sample-optima…
Neural Pfaffians: Solving Many Many-Electron Schrödinger Equations
·2649 words·13 mins
AI Theory Optimization 🏢 Technical University of Munich
Neural Pfaffians revolutionize many-electron Schrödinger equation solutions by using fully learnable neural wave functions based on Pfaffians, achieving unprecedented accuracy and generalizability acr…
Learning diffusion at lightspeed
·1990 words·10 mins
AI Theory Optimization 🏢 ETH Zurich
JKOnet* learns diffusion processes at unprecedented speed and accuracy by directly minimizing a simple quadratic loss function, bypassing complex bilevel optimization problems.
Identification and Estimation of the Bi-Directional MR with Some Invalid Instruments
·2386 words·12 mins
AI Theory Causality 🏢 Beijing Technology and Business University
The PReBiM algorithm accurately estimates bi-directional causal effects from observational data, even in the presence of invalid instruments, using a novel cluster fusion approach.
Generalization Error Bounds for Two-stage Recommender Systems with Tree Structure
·386 words·2 mins
AI Theory Generalization 🏢 University of Science and Technology of China
Two-stage recommender systems using tree structures achieve better generalization with more branches and harmonized training data distributions across stages.
Do Finetti: On Causal Effects for Exchangeable Data
·1344 words·7 mins
AI Theory Causality 🏢 Max Planck Institute
A new framework estimates causal effects from exchangeable data, enabling simultaneous causal discovery and effect estimation via the Do-Finetti algorithm.