UV-free Texture Generation with Denoising and Geodesic Heat Diffusion
·2448 words·12 mins·
Computer Vision
3D Vision
🏢 Imperial College London
UV3-TeD generates high-quality 3D textures directly on object surfaces using a novel diffusion probabilistic model, eliminating UV-mapping limitations.
Universal Sample Coding
·2065 words·10 mins·
AI Generated
Machine Learning
Federated Learning
🏢 Imperial College London
Universal Sample Coding revolutionizes data transmission by reducing bits needed to communicate multiple samples from an unknown distribution, achieving significant improvements in federated learning …
Transition Constrained Bayesian Optimization via Markov Decision Processes
·2420 words·12 mins·
Machine Learning
Reinforcement Learning
🏢 Imperial College London
This paper presents a novel BayesOpt framework that incorporates Markov Decision Processes to optimize black-box functions with transition constraints, overcoming limitations of traditional methods.
Towards Universal Mesh Movement Networks
·2599 words·13 mins·
🏢 Imperial College London
Universal Mesh Movement Network (UM2N) revolutionizes mesh movement for PDE solvers, enabling zero-shot adaptation to diverse problems and significantly accelerating simulations with improved accuracy…
Theoretical Foundations of Deep Selective State-Space Models
·379 words·2 mins·
AI Theory
Generalization
🏢 Imperial College London
Deep learning's sequence modeling is revolutionized by selective state-space models (SSMs)! This paper provides theoretical grounding for their superior performance, revealing the crucial role of gati…
OpenDlign: Open-World Point Cloud Understanding with Depth-Aligned Images
·2441 words·12 mins·
Computer Vision
3D Vision
🏢 Imperial College London
OpenDlign uses novel depth-aligned images from a diffusion model to boost open-world 3D understanding, achieving significant performance gains on diverse benchmarks.
Noether's Razor: Learning Conserved Quantities
·2052 words·10 mins·
AI Generated
Machine Learning
Deep Learning
🏢 Imperial College London
Noether's Razor learns conserved quantities and symmetries directly from data via Bayesian model selection, improving dynamical systems modeling accuracy and generalizability.
Measuring Goal-Directedness
·1615 words·8 mins·
AI Theory
Ethics
🏢 Imperial College London
New metric, Maximum Entropy Goal-Directedness (MEG), quantifies AI goal-directedness, crucial for assessing AI safety and agency.
Identifiable Object-Centric Representation Learning via Probabilistic Slot Attention
·2355 words·12 mins·
Machine Learning
Representation Learning
🏢 Imperial College London
Probabilistic Slot Attention achieves identifiable object-centric representations without supervision, advancing systematic generalization in machine learning.
ID-to-3D: Expressive ID-guided 3D Heads via Score Distillation Sampling
·1848 words·9 mins·
Computer Vision
3D Vision
🏢 Imperial College London
ID-to-3D: Generate expressive, identity-consistent 3D human heads from just a few in-the-wild images using score distillation sampling and 2D diffusion models.
Feedback control guides credit assignment in recurrent neural networks
·1962 words·10 mins·
AI Theory
Optimization
🏢 Imperial College London
Brain-inspired recurrent neural networks learn efficiently by using feedback control to approximate optimal gradients, enabling rapid movement corrections and efficient adaptation to persistent errors…
Entrywise error bounds for low-rank approximations of kernel matrices
·1461 words·7 mins·
AI Theory
Optimization
🏢 Imperial College London
This paper provides novel entrywise error bounds for low-rank kernel matrix approximations, showing how many data points are needed to get statistically consistent results for low-rank approximations.
Energy-Based Modelling for Discrete and Mixed Data via Heat Equations on Structured Spaces
·2907 words·14 mins·
Machine Learning
Deep Learning
🏢 Imperial College London
Train discrete EBMs efficiently with Energy Discrepancy, a novel loss function that eliminates the need for Markov Chain Monte Carlo, using diffusion processes on structured spaces.
Absorb & Escape: Overcoming Single Model Limitations in Generating Heterogeneous Genomic Sequences
·3759 words·18 mins·
Machine Learning
Deep Learning
🏢 Imperial College London
Absorb & Escape: a novel post-training sampling method that overcomes single model limitations by combining Autoregressive (AR) and Diffusion Models (DMs), generating high-quality heterogeneous genomi…