🏢 Imperial College London

UV-free Texture Generation with Denoising and Geodesic Heat Diffusion
·2448 words·12 mins
Computer Vision 3D Vision 🏢 Imperial College London
UV3-TeD generates high-quality 3D textures directly on object surfaces using a novel diffusion probabilistic model, eliminating UV-mapping limitations.
Universal Sample Coding
·2065 words·10 mins
AI Generated Machine Learning Federated Learning 🏢 Imperial College London
Universal Sample Coding reduces the number of bits needed to communicate multiple samples from an unknown distribution, achieving significant improvements in federated learning.
Transition Constrained Bayesian Optimization via Markov Decision Processes
·2420 words·12 mins
Machine Learning Reinforcement Learning 🏢 Imperial College London
This paper presents a novel BayesOpt framework that incorporates Markov Decision Processes to optimize black-box functions with transition constraints, overcoming limitations of traditional methods.
Towards Universal Mesh Movement Networks
·2599 words·13 mins
🏢 Imperial College London
Universal Mesh Movement Network (UM2N) revolutionizes mesh movement for PDE solvers, enabling zero-shot adaptation to diverse problems and significantly accelerating simulations with improved accuracy.
Theoretical Foundations of Deep Selective State-Space Models
·379 words·2 mins
AI Theory Generalization 🏢 Imperial College London
Selective state-space models (SSMs) are transforming sequence modeling in deep learning. This paper provides theoretical grounding for their superior performance, revealing the crucial role of gating.
OpenDlign: Open-World Point Cloud Understanding with Depth-Aligned Images
·2441 words·12 mins
Computer Vision 3D Vision 🏢 Imperial College London
OpenDlign uses novel depth-aligned images from a diffusion model to boost open-world 3D understanding, achieving significant performance gains on diverse benchmarks.
Noether's Razor: Learning Conserved Quantities
·2052 words·10 mins
AI Generated Machine Learning Deep Learning 🏢 Imperial College London
Noether’s Razor learns conserved quantities and symmetries directly from data via Bayesian model selection, improving the accuracy and generalizability of dynamical systems models.
Measuring Goal-Directedness
·1615 words·8 mins
AI Theory Ethics 🏢 Imperial College London
A new metric, Maximum Entropy Goal-Directedness (MEG), quantifies the goal-directedness of AI systems, a property crucial for assessing AI safety and agency.
Identifiable Object-Centric Representation Learning via Probabilistic Slot Attention
·2355 words·12 mins
Machine Learning Representation Learning 🏢 Imperial College London
Probabilistic Slot Attention achieves identifiable object-centric representations without supervision, advancing systematic generalization in machine learning.
ID-to-3D: Expressive ID-guided 3D Heads via Score Distillation Sampling
·1848 words·9 mins
Computer Vision 3D Vision 🏢 Imperial College London
ID-to-3D: Generate expressive, identity-consistent 3D human heads from just a few in-the-wild images using score distillation sampling and 2D diffusion models.
Feedback control guides credit assignment in recurrent neural networks
·1962 words·10 mins
AI Theory Optimization 🏢 Imperial College London
Brain-inspired recurrent neural networks learn efficiently by using feedback control to approximate optimal gradients, enabling rapid movement corrections and efficient adaptation to persistent errors.
Entrywise error bounds for low-rank approximations of kernel matrices
·1461 words·7 mins
AI Theory Optimization 🏢 Imperial College London
This paper derives novel entrywise error bounds for low-rank approximations of kernel matrices, showing how many data points are needed for statistically consistent low-rank approximations.
Energy-Based Modelling for Discrete and Mixed Data via Heat Equations on Structured Spaces
·2907 words·14 mins
Machine Learning Deep Learning 🏢 Imperial College London
Train discrete EBMs efficiently with Energy Discrepancy, a novel loss function that eliminates the need for Markov Chain Monte Carlo, using diffusion processes on structured spaces.
Absorb & Escape: Overcoming Single Model Limitations in Generating Heterogeneous Genomic Sequences
·3759 words·18 mins
Machine Learning Deep Learning 🏢 Imperial College London
Absorb &amp; Escape is a novel post-training sampling method that overcomes single-model limitations by combining Autoregressive (AR) models and Diffusion Models (DMs) to generate high-quality heterogeneous genomic sequences.