Representation Learning
Segment, Shuffle, and Stitch: A Simple Layer for Improving Time-Series Representations
·3043 words·15 mins·
Machine Learning
Representation Learning
🏢 Queen's University
Boost time-series model accuracy with Segment, Shuffle, and Stitch (S3)! This simple layer shuffles data segments to enhance representation learning, improving classification, forecasting, and anomaly…
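The titular operation is simple enough to sketch. Below is a minimal, hedged PyTorch illustration of segmenting a series, shuffling the segments, and stitching them back; the segment count and the fixed permutation are illustrative assumptions (the paper's S3 layer learns how to shuffle), not the authors' implementation.

```python
import torch

def segment_shuffle_stitch(x: torch.Tensor, n_segments: int, perm: torch.Tensor) -> torch.Tensor:
    # x: (batch, time, channels); split time into n_segments equal chunks,
    # reorder the chunks with `perm`, and concatenate them back together.
    B, T, C = x.shape
    assert T % n_segments == 0, "time length must be divisible by n_segments"
    segments = x.reshape(B, n_segments, T // n_segments, C)
    shuffled = segments[:, perm]          # shuffle along the segment axis
    return shuffled.reshape(B, T, C)      # stitch

# usage: shuffle 4 segments of a (2, 16, 3) batch with a fixed permutation
x = torch.randn(2, 16, 3)
out = segment_shuffle_stitch(x, n_segments=4, perm=torch.tensor([2, 0, 3, 1]))
```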
Schur Nets: exploiting local structure for equivariance in higher order graph neural networks
·1825 words·9 mins·
AI Theory
Representation Learning
🏢 University of Chicago
Schur Nets boost higher-order GNNs by efficiently exploiting local graph structure for automorphism equivariance, achieving improved performance without the computational burden of traditional methods…
Sample Complexity of Interventional Causal Representation Learning
·449 words·3 mins·
AI Theory
Representation Learning
🏢 Carnegie Mellon University
First finite-sample analysis of interventional causal representation learning shows that surprisingly few samples suffice for accurate graph and latent variable recovery.
Revisiting K-mer Profile for Effective and Scalable Genome Representation Learning
·1651 words·8 mins·
AI Theory
Representation Learning
🏢 Aalborg University
This paper proposes a lightweight and scalable k-mer based model for metagenomic binning, achieving comparable performance to computationally expensive genome foundation models while significantly imp…
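For background, a k-mer profile is just the normalized count vector of every length-k substring of a sequence. A minimal sketch, with the function name and normalization chosen for illustration rather than taken from the paper:

```python
from collections import Counter
from itertools import product

def kmer_profile(seq: str, k: int = 4) -> list:
    # frequency vector over all 4**k possible DNA k-mers in `seq`
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts.values()), 1)
    return [counts.get(km, 0) / total for km in kmers]

profile = kmer_profile("ACGTACGTGGCCAT", k=2)  # 16-dimensional profile
```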
Rethinking Reconstruction-based Graph-Level Anomaly Detection: Limitations and a Simple Remedy
·2177 words·11 mins·
Machine Learning
Representation Learning
🏢 Korea Advanced Institute of Science and Technology (KAIST)
MUSE, a novel graph anomaly detection method, leverages multifaceted summaries of reconstruction errors, achieving state-of-the-art performance by addressing limitations of existing Graph-AE-based met…
Random Representations Outperform Online Continually Learned Representations
·1894 words·9 mins·
Machine Learning
Representation Learning
🏢 University of Oxford
Random pixel projections outperform complex online continual learning methods for image classification, challenging assumptions about representation learning.
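The baseline behind this claim is easy to picture: freeze a random projection of raw pixels and train only a linear probe on top. A hedged numpy sketch with toy data (the shapes, the ReLU, and the least-squares probe are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, n_classes = 32 * 32 * 3, 2000, 10
W = rng.normal(0, 1 / np.sqrt(D), size=(D, d))   # fixed random projection, never trained

def embed(pixels):
    # flattened images (N, D) -> random features (N, d)
    return np.maximum(pixels @ W, 0)

X = rng.normal(size=(512, D))                    # toy stand-in for an image stream
y = rng.integers(0, n_classes, size=512)
probe, *_ = np.linalg.lstsq(embed(X), np.eye(n_classes)[y], rcond=None)
preds = (embed(X) @ probe).argmax(axis=1)        # the probe is the only trained part
```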
Pure Message Passing Can Estimate Common Neighbor for Link Prediction
·2519 words·12 mins·
Machine Learning
Representation Learning
🏢 University of Notre Dame
Pure message passing in graph neural networks can accurately estimate common neighbor heuristics for superior link prediction.
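The intuition admits a short sketch: give every node a random, nearly orthogonal signature vector; one sum-aggregation message-passing step hands each node the sum of its neighbors' signatures, and the inner product of two such sums concentrates around the common-neighbor count. A hedged numpy illustration, not the paper's exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 4096                                 # nodes, signature dimension
A = rng.random((n, n)) < 0.1                     # random undirected graph
A = np.triu(A, 1)
A = (A | A.T).astype(float)

X = rng.normal(0, 1 / np.sqrt(d), size=(n, d))   # E[x_i . x_j] is 1 iff i == j
H = A @ X                                        # one message-passing step

u, v = 3, 7
estimate = H[u] @ H[v]                           # ~ |N(u) ∩ N(v)|
exact = int((A[u] * A[v]).sum())
print(f"estimated {estimate:.1f} vs exact {exact} common neighbors")
```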
Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
·1812 words·9 mins·
AI Theory
Representation Learning
🏢 Northwestern University
Researchers achieve provably optimal memory capacity in transformer-compatible Hopfield models by framing the problem as an optimal spherical code arrangement, resulting in a novel sublinear time algo…
ProtGO: Function-Guided Protein Modeling for Unified Representation Learning
·1875 words·9 mins·
AI Generated
Natural Language Processing
Representation Learning
🏢 Westlake University
ProtGO: A novel unified framework integrating protein sequence, structure & function for superior representation learning, significantly outperforming current methods.
Preventing Model Collapse in Deep Canonical Correlation Analysis by Noise Regularization
·2437 words·12 mins·
Multimodal Learning
Representation Learning
🏢 Hong Kong Polytechnic University
Noise Regularization rescues Deep Canonical Correlation Analysis from model collapse!
Poseidon: Efficient Foundation Models for PDEs
·9448 words·45 mins·
AI Theory
Representation Learning
🏢 ETH Zurich
POSEIDON: a novel foundation model for PDEs achieves significant gains in accuracy and sample efficiency, generalizing well to unseen physics.
PLIP: Language-Image Pre-training for Person Representation Learning
·3449 words·17 mins·
Computer Vision
Representation Learning
🏢 Huazhong University of Science and Technology
PLIP: Novel language-image pre-training framework excels at person representation learning, surpassing existing methods on various downstream tasks thanks to its three pretext tasks and large-scale SY…
On the Role of Attention Masks and LayerNorm in Transformers
·2522 words·12 mins·
AI Generated
AI Theory
Representation Learning
🏢 MIT
Transformers’ self-attention mechanism, while powerful, suffers from rank collapse with increasing depth. This paper reveals that while masked attention still leads to exponential collapse, sparse att…
On the Impact of Feature Heterophily on Link Prediction with Graph Neural Networks
·2063 words·10 mins·
AI Theory
Representation Learning
🏢 University of Michigan
Graph Neural Networks (GNNs) struggle with heterophilic link prediction; this paper introduces formal definitions, theoretical analysis, improved designs, and real-world benchmarks to address this cha…
On Affine Homotopy between Language Encoders
·2070 words·10 mins·
AI Generated
Natural Language Processing
Representation Learning
🏢 ETH Zurich
This paper introduces a novel notion of intrinsic similarity between language encoders, based on affine homotopy, and demonstrates its strong correlation with extrinsic similarity (downstream task per…
Not so griddy: Internal representations of RNNs path integrating more than one agent
·2491 words·12 mins·
AI Theory
Representation Learning
🏢 Johns Hopkins Applied Physics Laboratory
RNNs trained on dual-agent path integration develop distinct internal representations compared to single-agent models, exhibiting weaker grid cell responses and enhanced border/band cell activity, wit…
Non-Euclidean Mixture Model for Social Network Embedding
·2185 words·11 mins·
Machine Learning
Representation Learning
🏢 University of California, Los Angeles
Non-Euclidean Mixture Model (NMM-GNN) outperforms existing methods by using spherical and hyperbolic spaces to model homophily and social influence in social network embedding, improving link predicti…
Neural Persistence Dynamics
·2242 words·11 mins·
AI Theory
Representation Learning
🏢 University of Salzburg
Neural Persistence Dynamics learns collective behavior from topological features, accurately predicting parameters of governing equations without tracking individual entities.
Mutual Information Estimation via Normalizing Flows
·2080 words·10 mins·
AI Theory
Representation Learning
🏢 Skoltech
Researchers introduce a novel approach to mutual information (MI) estimation using normalizing flows, providing accurate estimates even in high dimensions.
Multi-Scale Representation Learning for Protein Fitness Prediction
·1447 words·7 mins·
Machine Learning
Representation Learning
🏢 Mila - Québec AI Institute
S3F: a novel multi-scale model achieves state-of-the-art protein fitness prediction by integrating protein sequence, structure, and surface features.