Representation Learning
Zipfian Whitening
·2044 words·10 mins·
Natural Language Processing
Representation Learning
🏢 Tohoku University
Zipfian Whitening: Weighting PCA whitening by word frequency dramatically improves NLP task performance, surpassing established baselines and providing a theoretical framework for existing methods.
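The recipe is compact enough to sketch. Below is a minimal, hedged rendering of frequency-weighted PCA whitening in NumPy; the array names, the unigram-probability weighting, and the toy Zipfian frequencies are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def zipfian_whiten(E, p):
    """Whiten word embeddings E (V x d), weighting each word by its
    unigram probability p (V,) instead of uniformly."""
    p = p / p.sum()
    mu = p @ E                              # frequency-weighted mean
    X = E - mu
    cov = (X * p[:, None]).T @ X            # frequency-weighted covariance
    eigval, eigvec = np.linalg.eigh(cov)
    W = eigvec / np.sqrt(eigval + 1e-8)     # PCA whitening transform
    return X @ W

# toy usage: 1,000 words, 50-dim embeddings, Zipf-distributed frequencies
rng = np.random.default_rng(0)
E = rng.normal(size=(1000, 50))
freq = 1.0 / np.arange(1, 1001)             # Zipf's law: f(r) ∝ 1/r
E_white = zipfian_whiten(E, freq)
```

Uniform weights (constant p) recover vanilla PCA whitening, which is what makes the frequency weighting the essential change.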
When is an Embedding Model More Promising than Another?
·4115 words·20 mins·
AI Theory
Representation Learning
🏢 Mila - Quebec AI Institute
This paper introduces a novel, task-agnostic method for ranking embedding models using information sufficiency, a concept derived from communication theory and the statistical comparison of experiments, demo…
When does perceptual alignment benefit vision representations?
·4058 words·20 mins·
AI Generated
Computer Vision
Representation Learning
🏢 MIT
Aligning vision models to human perceptual similarity judgments significantly boosts performance in diverse vision tasks like counting and segmentation, but surprisingly reduces performance in natural…
What Variables Affect Out-of-Distribution Generalization in Pretrained Models?
·4187 words·20 mins·
Computer Vision
Representation Learning
🏢 Rochester Institute of Technology
High-resolution datasets with diverse classes significantly improve the transferability of pretrained DNNs by reducing representation compression and mitigating the ‘tunnel effect’.
What Is Missing For Graph Homophily? Disentangling Graph Homophily For Graph Neural Networks
·2555 words·12 mins·
AI Generated
AI Theory
Representation Learning
🏢 Nanyang Technological University
Tri-Hom disentangles graph homophily into label, structural, and feature aspects, providing a more comprehensive and accurate metric for predicting GNN performance.
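Of the three facets, label homophily is the most familiar; as a point of reference, a minimal NumPy sketch of the classic edge-level label homophily ratio follows. Tri-Hom's actual metrics refine this and add the structural and feature facets, which are not sketched here.

```python
import numpy as np

def edge_label_homophily(edges, labels):
    """Fraction of edges whose two endpoints share a class label."""
    src, dst = edges
    return float(np.mean(labels[src] == labels[dst]))

# toy graph: 4 nodes in two classes, 4 edges
edges = (np.array([0, 1, 2, 0]), np.array([1, 2, 3, 3]))
labels = np.array([0, 0, 1, 1])
print(edge_label_homophily(edges, labels))  # 0.5
```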
What do Graph Neural Networks learn? Insights from Tropical Geometry
·1465 words·7 mins·
AI Theory
Representation Learning
🏢 University of Edinburgh
Using tropical geometry, researchers reveal that ReLU-activated message-passing GNNs learn continuous piecewise linear functions, highlighting their expressivity limits and paving the way for enhanced…
Weisfeiler and Leman Go Loopy: A New Hierarchy for Graph Representational Learning
·3118 words·15 mins·
AI Theory
Representation Learning
🏢 Munich Center for Machine Learning
This paper introduces r-lWL, a new graph isomorphism test hierarchy that surpasses the limitations of the Weisfeiler-Leman test by counting cycles up to length r+2, and its GNN counterpart, r-lMPNN, w…
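As background for what r-lWL adds, here is a minimal sketch of classic 1-WL color refinement, together with the textbook failure case that motivates cycle counting: a 6-cycle and two disjoint triangles are both 2-regular, so 1-WL assigns them identical color histograms, whereas any test that counts 3-cycles separates them.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL color refinement on an adjacency-list graph."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        sig = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
               for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        colors = {v: relabel[sig[v]] for v in adj}
    return Counter(colors.values())   # the graph's color histogram

c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}   # one 6-cycle
two_c3 = {0: [1, 2], 1: [0, 2], 2: [0, 1],
          3: [4, 5], 4: [3, 5], 5: [3, 4]}               # two triangles
print(wl_colors(c6) == wl_colors(two_c3))                # True: 1-WL is blind here
```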
Wasserstein convergence of Čech persistence diagrams for samplings of submanifolds
·1477 words·7 mins·
AI Theory
Representation Learning
🏢 Université Paris-Saclay, Inria
This paper proves that Čech persistence diagrams converge to the true underlying shape precisely when using Wasserstein distances with p > m, where m is the submanifold dimension, significantly advanc…
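Schematically, and with the precise hypotheses (sampling model, diagram degree, ambient dimension) deferred to the paper, the claim has the following shape:

```latex
% For samplings X_n of a compact m-dimensional submanifold M,
% Čech persistence diagrams converge in W_p exactly in the regime p > m.
\[
  W_p\!\left( \operatorname{dgm}\bigl(\operatorname{\check{C}ech}(X_n)\bigr),
              \operatorname{dgm}\bigl(\operatorname{\check{C}ech}(M)\bigr) \right)
  \xrightarrow[n \to \infty]{} 0
  \quad\Longleftrightarrow\quad p > m .
\]
```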
Transferring disentangled representations: bridging the gap between synthetic and real images
·3866 words·19 mins·
Machine Learning
Representation Learning
🏢 Università Degli Studi Di Genova
This paper bridges the synthetic-to-real gap in image disentanglement with a novel transfer learning approach. The method leverages weakly supervised learning on synthetic data to train…
Trading Place for Space: Increasing Location Resolution Reduces Contextual Capacity in Hippocampal Codes
·1950 words·10 mins·
AI Theory
Representation Learning
🏢 University of Pennsylvania
Boosting hippocampal spatial resolution surprisingly shrinks its contextual memory capacity, revealing a crucial trade-off between precision and context storage.
Towards Stable Representations for Protein Interface Prediction
·2364 words·12 mins·
AI Generated
Machine Learning
Representation Learning
🏢 Hong Kong University of Science and Technology
ATProt: Adversarial training makes protein interface prediction robust to flexibility!
The motion planning neural circuit in goal-directed navigation as Lie group operator search
·1385 words·7 mins·
loading
·
loading
AI Theory
Representation Learning
🏢 UT Southwestern Medical Center
Neural circuits for goal-directed navigation are modeled as Lie group operator searches, implemented by a two-layer feedforward circuit mimicking Drosophila’s navigation system.
Test-time Adaptation in Non-stationary Environments via Adaptive Representation Alignment
·2451 words·12 mins·
loading
·
loading
AI Generated
Machine Learning
Representation Learning
🏢 Stanford University
Ada-ReAlign: a novel algorithm for continual test-time adaptation that leverages non-stationary representation learning to effectively align unlabeled data streams with source data, enhancing model ad…
Temporal Graph Neural Tangent Kernel with Graphon-Guaranteed
·1777 words·9 mins·
AI Theory
Representation Learning
🏢 Meta AI
Temp-G³NTK: a novel temporal graph neural tangent kernel that provably converges to the graphon NTK, delivering superior performance in temporal graph classification and node-level tasks.
SubgDiff: A Subgraph Diffusion Model to Improve Molecular Representation Learning
·2691 words·13 mins·
Machine Learning
Representation Learning
🏢 Hong Kong University of Science and Technology
SubgDiff enhances molecular representation learning by incorporating substructural information into a diffusion model framework, achieving superior performance in molecular force predictions.
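One plausible reading of subgraph-restricted noising, sketched in NumPy with illustrative names: at each forward diffusion step, Gaussian noise perturbs only the atoms inside a sampled substructure mask, so the rest of the conformer stays fixed. SubgDiff's actual subgraph selection and noise schedule differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def subgraph_noise_step(coords, mask, beta):
    """One forward step that noises only atoms with mask == 1."""
    eps = rng.normal(size=coords.shape)
    noised = np.sqrt(1.0 - beta) * coords + np.sqrt(beta) * eps
    return np.where(mask[:, None] == 1, noised, coords)

coords = rng.normal(size=(12, 3))      # 12 atoms, 3-D conformer
mask = rng.integers(0, 2, size=12)     # sampled substructure indicator
coords_t = subgraph_noise_step(coords, mask, beta=0.02)
```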
Structured flexibility in recurrent neural networks via neuromodulation
·1567 words·8 mins·
AI Theory
Representation Learning
🏢 Stanford University
Neuromodulated RNNs (NM-RNNs) enhance RNN flexibility by dynamically scaling recurrent weights using a neuromodulatory subnetwork, achieving higher accuracy and generalizability on various tasks compa…
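The stated mechanism fits in a few lines. In the hedged NumPy sketch below, a small neuromodulatory subnetwork emits per-unit gains that rescale the main network's recurrent weights at every step; the dimensions, nonlinearities, and the low-rank structure of the actual NM-RNN are assumptions here.

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h, d_m = 3, 16, 4                       # input / main / neuromodulatory sizes
W  = rng.normal(scale=0.1, size=(d_h, d_h))    # recurrent weights (main RNN)
U  = rng.normal(scale=0.1, size=(d_h, d_x))
Wm = rng.normal(scale=0.1, size=(d_m, d_m))    # neuromodulatory subnetwork
Um = rng.normal(scale=0.1, size=(d_m, d_x))
G  = rng.normal(scale=0.1, size=(d_h, d_m))    # maps NM state to gains

h, m = np.zeros(d_h), np.zeros(d_m)
for x in rng.normal(size=(20, d_x)):           # toy input sequence
    m = np.tanh(Wm @ m + Um @ x)               # neuromodulatory dynamics
    s = 1.0 + np.tanh(G @ m)                   # per-unit gain signal
    h = np.tanh((s[:, None] * W) @ h + U @ x)  # gain-scaled recurrence
```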
Spectral Graph Pruning Against Over-Squashing and Over-Smoothing
·4594 words·22 mins·
AI Generated
AI Theory
Representation Learning
🏢 Universität Des Saarlandes
Spectral graph pruning simultaneously mitigates over-squashing and over-smoothing in GNNs via edge deletion, improving generalization.
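To make 'pruning by edge deletion' concrete, here is an illustrative Braess-style proxy, not the paper's exact criterion: delete an edge only when its removal increases the spectral gap of the symmetric normalized Laplacian.

```python
import numpy as np

def spectral_gap(A):
    """Second-smallest eigenvalue of the normalized Laplacian of A."""
    d = np.maximum(A.sum(1), 1e-12)
    D = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(A)) - D @ A @ D
    return np.sort(np.linalg.eigvalsh(L))[1]

def prune_one_edge(A):
    """Delete the edge whose removal most increases the gap, if any."""
    best, best_A = spectral_gap(A), A
    for i, j in zip(*np.triu_indices_from(A, k=1)):
        if A[i, j]:
            B = A.copy()
            B[i, j] = B[j, i] = 0
            g = spectral_gap(B)
            if g > best:
                best, best_A = g, B
    return best_A   # returns A unchanged if no single deletion helps
```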
Soft Tensor Product Representations for Fully Continuous, Compositional Visual Representations
·8738 words·42 mins·
Computer Vision
Representation Learning
🏢 UNSW, Sydney
Soft Tensor Product Representations (Soft TPRs) revolutionize compositional visual representation learning by seamlessly blending continuous vector spaces and compositional structures, leading to supe…
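Soft TPRs build on classical tensor product representations, which are easy to state: a composite representation is the sum of role–filler outer products, and orthonormal role vectors make every filler exactly recoverable. A minimal NumPy sketch of that classical baseline follows; the fully continuous 'soft' relaxation itself is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
roles = np.linalg.qr(rng.normal(size=(4, 4)))[0]   # orthonormal role vectors
fillers = rng.normal(size=(3, 6))                  # e.g. color, shape, size

# bind: the scene is a sum of role ⊗ filler outer products
T = sum(np.outer(roles[i], fillers[i]) for i in range(3))

# unbind: orthonormality lets each filler be read out exactly
recovered = roles[:3] @ T
assert np.allclose(recovered, fillers)
```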
Shape analysis for time series
·2156 words·11 mins·
Machine Learning
Representation Learning
🏢 Université Paris-Saclay
TS-LDDMM: unsupervised time-series analysis that handles irregularly sampled data, offering interpretable shape-based representations and outperforming existing methods on benchmarks.
Sequential Signal Mixing Aggregation for Message Passing Graph Neural Networks
·2468 words·12 mins·
AI Theory
Representation Learning
🏢 Technion
Sequential Signal Mixing Aggregation (SSMA) boosts message-passing graph neural network performance by effectively mixing neighbor features, achieving state-of-the-art results across various benchmark…