Self-Supervised Learning

Addressing Spatial-Temporal Heterogeneity: General Mixed Time Series Analysis via Latent Continuity Recovery and Alignment
·4775 words·23 mins
AI Generated Machine Learning Self-Supervised Learning 🏢 College of Control Science and Engineering, Zhejiang University, China
MiTSformer, a novel framework, recovers latent continuous variables from discrete data to enable complete spatial-temporal modeling of mixed time series, achieving state-of-the-art performance on mult…
Accelerating Augmentation Invariance Pretraining
·1854 words·9 mins
Computer Vision Self-Supervised Learning 🏢 University of Wisconsin-Madison
Boost Vision Transformer pretraining speed by 4x with novel sequence compression techniques.
Abstracted Shapes as Tokens: A Generalizable and Interpretable Model for Time-series Classification
·3172 words·15 mins
AI Generated Machine Learning Self-Supervised Learning 🏢 Rensselaer Polytechnic Institute
VQShape, a pre-trained model, uses abstracted shapes as interpretable tokens for generalizable time-series classification, achieving performance comparable to black-box models and excelling in zero-sho…
A probability contrastive learning framework for 3D molecular representation learning
·2012 words·10 mins
Machine Learning Self-Supervised Learning 🏢 University at Buffalo
A novel probability-based contrastive learning framework significantly improves 3D molecular representation learning by mitigating false pairs, achieving state-of-the-art results.