Self-Supervised Learning
Localizing Memorization in SSL Vision Encoders
·4999 words·24 mins·
Machine Learning
Self-Supervised Learning
🏢 CISPA, Helmholtz Center for Information Security
SSL vision encoders, while trained on massive datasets, surprisingly memorize individual data points. This paper introduces novel methods to precisely pinpoint this memorization within encoders at bot…
Learning predictable and robust neural representations by straightening image sequences
·2413 words·12 mins·
AI Generated
Machine Learning
Self-Supervised Learning
🏢 Center for Neural Science, New York University
Self-supervised learning gets a boost: a new objective function trains robust & predictive neural networks by straightening video trajectories, surpassing invariance-based methods for better spatiotemporal re…
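To make the straightening idea in this entry concrete, here is a minimal sketch of one way such an objective can be written. The function name, tensor shapes, and exact loss form are assumptions for illustration, not the paper's implementation:

```python
import torch
import torch.nn.functional as F

def straightening_loss(z: torch.Tensor) -> torch.Tensor:
    """Penalize curvature in a trajectory of frame representations.

    z: (T, D) representations of T consecutive video frames.
    The trajectory is perfectly 'straight' when successive displacement
    vectors d_t = z[t+1] - z[t] all point in the same direction.
    """
    d = F.normalize(z[1:] - z[:-1], dim=-1)  # unit displacement directions
    cos = (d[1:] * d[:-1]).sum(dim=-1)       # cosine of the turn angle at each step
    return (1.0 - cos).mean()                # 0 for a perfectly straight trajectory
```

Minimizing this term rewards representations whose temporal evolution is linear, in contrast to invariance objectives that collapse temporal variation entirely.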
Large Pre-trained time series models for cross-domain Time series analysis tasks
·1870 words·9 mins·
Machine Learning
Self-Supervised Learning
🏢 Georgia Institute of Technology
Large Pre-trained Time-series Models (LPTM) achieve superior forecasting and time-series classification results using a novel adaptive segmentation method, requiring up to 40% less data and 50% less …
In-Context Symmetries: Self-Supervised Learning through Contextual World Models
·3570 words·17 mins·
Computer Vision
Self-Supervised Learning
🏢 MIT CSAIL
CONTEXTSSL: A novel self-supervised learning algorithm that adapts to task-specific symmetries by using context, achieving significant performance gains over existing methods.
Identify Then Recommend: Towards Unsupervised Group Recommendation
·1520 words·8 mins·
Machine Learning
Self-Supervised Learning
🏢 Ant Group
ITR, an unsupervised group recommendation model, achieves superior user and group recommendation accuracy by dynamically identifying user groups and employing self-supervised learning, eliminating the ne…
Harnessing small projectors and multiple views for efficient vision pretraining
·2903 words·14 mins·
Computer Vision
Self-Supervised Learning
🏢 Mila - Quebec AI Institute & Computer Science, McGill University
Boost self-supervised visual learning: This paper introduces theoretical insights and practical recommendations to significantly improve SSL’s efficiency and reduce data needs.
FUG: Feature-Universal Graph Contrastive Pre-training for Graphs with Diverse Node Features
·2145 words·11 mins·
Machine Learning
Self-Supervised Learning
🏢 Tianjin University
FUG: A new graph contrastive pre-training strategy solves GNN transferability issues across datasets with diverse node features, achieving comparable performance to retraining while significantly impr…
Flexible mapping of abstract domains by grid cells via self-supervised extraction and projection of generalized velocity signals
·2121 words·10 mins·
Machine Learning
Self-Supervised Learning
🏢 MIT
The brain’s flexible mapping of abstract domains is achieved by grid cells through self-supervised extraction and projection of generalized velocity signals, enabling efficient map generation.
Exploring Molecular Pretraining Model at Scale
·2151 words·11 mins·
AI Generated
Machine Learning
Self-Supervised Learning
🏢 Peking University
Uni-Mol2, a groundbreaking 1.1B parameter molecular pretraining model, reveals power-law scaling in molecular representation learning, achieving significant performance improvements on downstream task…
Exploiting Representation Curvature for Boundary Detection in Time Series
·2189 words·11 mins·
Machine Learning
Self-Supervised Learning
🏢 KAIST
RECURVE: A novel boundary detection method leveraging representation trajectory curvature, surpassing state-of-the-art techniques by accommodating both gradual and abrupt changes in time series.
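The curvature signal this entry relies on is easy to illustrate. Below is a hedged NumPy sketch of curvature-based boundary scoring; the function names and the thresholding rule are illustrative assumptions, not RECURVE's exact procedure:

```python
import numpy as np

def trajectory_curvature(z: np.ndarray) -> np.ndarray:
    """Discrete curvature along a representation trajectory.

    z: (T, D) array of per-timestep representations from any
    self-supervised time-series encoder. Returns, for each interior
    timestep, the turn angle (radians) between successive displacements.
    """
    d = np.diff(z, axis=0)                                # displacements
    d /= np.linalg.norm(d, axis=1, keepdims=True) + 1e-8  # unit directions
    cos = np.clip((d[1:] * d[:-1]).sum(axis=1), -1.0, 1.0)
    return np.arccos(cos)

def detect_boundaries(z: np.ndarray) -> list[int]:
    """Flag timesteps whose curvature rises well above its typical level."""
    c = trajectory_curvature(z)
    threshold = c.mean() + c.std()  # assumed scoring rule for illustration
    return [t + 1 for t, v in enumerate(c) if v > threshold]
```

Because curvature responds to the direction of change rather than its magnitude, a score like this can register both gradual drifts and abrupt jumps, which is the property the teaser highlights.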
Exploitation of a Latent Mechanism in Graph Contrastive Learning: Representation Scattering
·1847 words·9 mins·
Self-Supervised Learning
🏢 Tianjin University
SGRL, a novel graph contrastive learning framework, significantly boosts performance by leveraging the inherent ‘representation scattering’ mechanism and integrating graph topology, outperforming exis…
Efficient Availability Attacks against Supervised and Contrastive Learning Simultaneously
·3327 words·16 mins·
AI Generated
Machine Learning
Self-Supervised Learning
🏢 Academy of Mathematics and Systems Science, Chinese Academy of Sciences
New attacks foil both supervised and contrastive learning, achieving state-of-the-art unlearnability with less computation.
Efficiency for Free: Ideal Data Are Transportable Representations
·4111 words·20 mins·
AI Generated
Machine Learning
Self-Supervised Learning
🏢 Westlake University
RELA accelerates representation learning by leveraging freely available pre-trained models to generate efficient data, reducing computational costs by up to 50% while maintaining accuracy.
EEGPT: Pretrained Transformer for Universal and Reliable Representation of EEG Signals
·3265 words·16 mins·
Machine Learning
Self-Supervised Learning
🏢 Harbin Institute of Technology
EEGPT: A pretrained transformer model revolutionizes EEG signal representation by using a dual self-supervised learning method, achieving state-of-the-art results across various tasks.
Data-Efficient Operator Learning via Unsupervised Pretraining and In-Context Learning
·4184 words·20 mins·
AI Generated
Machine Learning
Self-Supervised Learning
🏢 Simon Fraser University
Data-efficient neural operator learning is achieved via unsupervised pretraining and in-context learning, significantly reducing simulation costs and improving generalization.
Contrastive-Equivariant Self-Supervised Learning Improves Alignment with Primate Visual Area IT
·2007 words·10 mins·
Computer Vision
Self-Supervised Learning
🏢 Center for Neural Science, New York University
Self-supervised learning models can now better predict primate IT neural responses by preserving structured variability to input transformations, improving alignment with biological visual perception.
Connecting Joint-Embedding Predictive Architecture with Contrastive Self-supervised Learning
·2598 words·13 mins·
Self-Supervised Learning
🏢 Carnegie Mellon University
C-JEPA boosts self-supervised visual learning by integrating contrastive learning with a joint-embedding predictive architecture, enhancing stability and representation quality.
Cell ontology guided transcriptome foundation model
·4051 words·20 mins·
Self-Supervised Learning
🏢 University of Toronto
scCello: A Cell Ontology-Guided Transcriptome Foundation Model improves single-cell RNA sequencing analysis by incorporating cell lineage information, significantly boosting accuracy and generalizabil…
Causal Contrastive Learning for Counterfactual Regression Over Time
·3424 words·17 mins·
Machine Learning
Self-Supervised Learning
🏢 Paris-Saclay University
Causal CPC: a novel method for accurate and efficient counterfactual regression over time using RNNs, CPC, and InfoMax, achieving state-of-the-art performance.
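For readers unfamiliar with the CPC component this entry builds on, its contrastive ingredient is typically an InfoNCE objective. A generic sketch follows, assuming in-batch negatives; this is the standard CPC-style loss, not necessarily the paper's exact objective:

```python
import torch
import torch.nn.functional as F

def info_nce(pred: torch.Tensor, targets: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE loss as used in Contrastive Predictive Coding (CPC).

    pred:    (B, D) predicted future representations (e.g. from a context RNN).
    targets: (B, D) encoder outputs of the true future steps; the other
             rows in the batch serve as negatives.
    """
    pred = F.normalize(pred, dim=-1)
    targets = F.normalize(targets, dim=-1)
    logits = pred @ targets.T / temperature  # (B, B) similarity matrix
    labels = torch.arange(pred.size(0), device=pred.device)
    return F.cross_entropy(logits, labels)   # positives on the diagonal
```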
Brain-JEPA: Brain Dynamics Foundation Model with Gradient Positioning and Spatiotemporal Masking
·2427 words·12 mins·
Self-Supervised Learning
🏢 National University of Singapore
Brain-JEPA: a novel brain dynamics foundation model leverages fMRI data via innovative gradient positioning and spatiotemporal masking to achieve state-of-the-art performance in diverse brain activity…