Unsupervised Learning
Unsupervised Discovery of Formulas for Mathematical Constants
·4062 words·20 mins·
AI Generated
Machine Learning
Unsupervised Learning
Technion - Israel Institute of Technology
AI automates mathematical constant formula discovery by analyzing convergence dynamics, revealing known and novel formulas for π, ln(2), and other constants.
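Formulas of this kind are often expressed as polynomial continued fractions whose convergence behavior identifies the constant they compute. As a toy illustration of the objects involved (a textbook identity, not one of the paper's discoveries), the sketch below evaluates a classical continued fraction related to π; the `continued_fraction` helper is an assumption for the example.

```python
import math

def continued_fraction(a, b, depth):
    """Evaluate a(0) + b(1)/(a(1) + b(2)/(a(2) + ...)) to the given depth,
    where a and b map the index n to partial denominators and numerators."""
    value = a(depth)
    for n in range(depth, 0, -1):
        value = a(n - 1) + b(n) / value
    return value

# Classical identity (Gauss's continued fraction for arctan, evaluated at 1):
# 4/pi = 1 + 1^2 / (3 + 2^2 / (5 + 3^2 / (7 + ...)))
approx = continued_fraction(lambda n: 2 * n + 1, lambda n: n ** 2, depth=200)
print(4 / approx, math.pi)   # both ~3.141592653589793
```

Tracking how quickly such a sequence of convergents approaches its limit is the kind of convergence signature the unsupervised pipeline analyzes.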
Unsupervised Anomaly Detection in The Presence of Missing Values
·3139 words·15 mins·
Machine Learning
Unsupervised Learning
Chinese University of Hong Kong, Shenzhen, China
ImAD: an end-to-end unsupervised anomaly detection method that tackles missing values by integrating imputation and detection in a unified framework, achieving superior accuracy.
The tree autoencoder model, with application to hierarchical data visualization
·2243 words·11 mins·
Machine Learning
Unsupervised Learning
Dept. of Computer Science and Engineering, University of California, Merced
PCA tree: a novel hierarchical dimensionality reduction model visualized using oblique trees and local PCAs, offering speed and interpretability.
The Star Geometry of Critic-Based Regularizer Learning
·1709 words·9 mins·
Machine Learning
Unsupervised Learning
University of California, Los Angeles
Star geometry reveals optimal data-driven regularizers!
The Map Equation Goes Neural: Mapping Network Flows with Graph Neural Networks
·3312 words·16 mins·
Machine Learning
Unsupervised Learning
University of Zurich
Neuromap leverages graph neural networks to optimize the map equation for community detection, achieving competitive performance and automatically determining the optimal number of clusters.
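For context, the objective Neuromap optimizes is the classical two-level map equation. Below is a minimal sketch that computes it for an undirected, unweighted graph and a given node-to-community assignment, using degree-proportional visit rates; the `map_equation` helper and the networkx dependency are illustrative choices, not code from the paper.

```python
import numpy as np
import networkx as nx

def map_equation(G, labels):
    """Two-level map equation L(M), in bits, for an undirected unweighted
    networkx graph G and a dict mapping node -> community label."""
    two_m = 2.0 * G.number_of_edges()
    p = {v: G.degree(v) / two_m for v in G}          # stationary visit rates
    modules = set(labels.values())
    q = {m: 0.0 for m in modules}                    # community exit rates
    for u, v in G.edges():
        if labels[u] != labels[v]:
            q[labels[u]] += 1.0 / two_m
            q[labels[v]] += 1.0 / two_m

    def h(ps):                                       # entropy, skipping zeros
        ps = np.array([x for x in ps if x > 0.0])
        return float(-(ps * np.log2(ps)).sum()) if ps.size else 0.0

    q_total = sum(q.values())
    index_len = q_total * h([q[m] / q_total for m in modules]) if q_total > 0 else 0.0
    module_len = 0.0
    for m in modules:
        p_m = sum(p[v] for v in G if labels[v] == m) + q[m]
        module_len += p_m * h([p[v] / p_m for v in G if labels[v] == m] + [q[m] / p_m])
    return index_len + module_len
```

Minimizing this codelength over partitions is the community-detection objective the graph neural network in Neuromap learns to optimize end to end.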
Robust Contrastive Multi-view Clustering against Dual Noisy Correspondence
·2413 words·12 mins·
AI Generated
Machine Learning
Unsupervised Learning
College of Computer Science, Sichuan University, China
CANDY refines contrastive multi-view clustering by cleverly using inter-view similarities to identify and correct false negatives and a spectral method to remove false positives, resulting in significant performance gains.
Rethinking the Diffusion Models for Missing Data Imputation: A Gradient Flow Perspective
·3317 words·16 mins·
Machine Learning
Unsupervised Learning
Zhejiang University
NewImp boosts diffusion models’ missing data imputation by curbing sample diversity and eliminating data masking, achieving superior accuracy.
Protein-Nucleic Acid Complex Modeling with Frame Averaging Transformer
·1968 words·10 mins·
Machine Learning
Unsupervised Learning
Carnegie Mellon University
Unsupervised learning predicts protein-nucleic acid binding using contact map prediction, significantly improving aptamer screening via FAFormer, a novel equivariant transformer.
Out-of-Distribution Detection with a Single Unconditional Diffusion Model
·2009 words·10 mins·
Machine Learning
Unsupervised Learning
Department of Computer Science, National University of Singapore
Single diffusion model achieves competitive out-of-distribution detection across diverse tasks by analyzing diffusion path characteristics.
Oja's Algorithm for Streaming Sparse PCA
·382 words·2 mins·
Machine Learning
Unsupervised Learning
University of Texas at Austin
Oja’s algorithm achieves minimax optimal error rates for streaming sparse PCA using a simple single-pass thresholding method, requiring only O(d) space and O(nd) time.
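As a rough illustration of the single-pass idea, the sketch below runs the standard Oja update and finishes with one hard-thresholding step that keeps the k largest-magnitude coordinates; the step-size schedule and the `oja_sparse_pca` name are assumptions for the example, not the paper's exact estimator.

```python
import numpy as np

def oja_sparse_pca(stream, d, k, lr=1.0, seed=0):
    """Single-pass estimate of a k-sparse leading eigenvector.
    stream: iterable of length-d sample vectors; k: number of nonzeros kept."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for t, x in enumerate(stream, start=1):
        w += (lr / t) * x * (x @ w)      # Oja update: O(d) memory and work per sample
        w /= np.linalg.norm(w)
    keep = np.argsort(np.abs(w))[-k:]    # the single thresholding step
    w_sparse = np.zeros(d)
    w_sparse[keep] = w[keep]
    return w_sparse / np.linalg.norm(w_sparse)
```

Feeding it samples from a spiked covariance model (a sparse direction plus isotropic noise) is a quick way to sanity-check the support recovery.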
Nonlinear dynamics of localization in neural receptive fields
·1762 words·9 mins·
Unsupervised Learning
Yale University
Neural receptive fields’ localization emerges from nonlinear learning dynamics driven by naturalistic data’s higher-order statistics, not just sparsity.
Near-Optimality of Contrastive Divergence Algorithms
·280 words·2 mins·
Machine Learning
Unsupervised Learning
Gatsby Computational Neuroscience Unit, University College London
Contrastive Divergence algorithms achieve near-optimal parameter estimation rates, matching the Cramér-Rao lower bound under specific conditions, as proven by a novel non-asymptotic analysis.
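The estimator under analysis is the familiar contrastive divergence update; for concreteness, here is a hedged sketch of the CD-1 gradient estimate for a Bernoulli-Bernoulli RBM (the function names and batch conventions are illustrative, not from the paper).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_gradients(v0, W, b, c, rng):
    """CD-1 estimates of the log-likelihood gradients of a Bernoulli RBM.
    v0: (n, d) visible batch; W: (d, h) weights; b: (d,) visible bias; c: (h,) hidden bias."""
    ph0 = sigmoid(v0 @ W + c)                          # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                        # one Gibbs step back to the visibles
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                          # negative phase
    n = v0.shape[0]
    dW = (v0.T @ ph0 - v1.T @ ph1) / n
    db = (v0 - v1).mean(axis=0)
    dc = (ph0 - ph1).mean(axis=0)
    return dW, db, dc
```

The paper's contribution is the non-asymptotic rate achieved by parameter estimates built from updates of this form, not the update rule itself.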
Multidimensional Fractional Programming for Normalized Cuts
·1661 words·8 mins·
Machine Learning
Unsupervised Learning
School of Science and Engineering, The Chinese University of Hong Kong (Shenzhen)
Multidimensional Fractional Programming (MFP) efficiently solves the challenging Normalized Cut (NCut) problem for multi-class clustering, outperforming existing methods.
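As a point of reference, the classical way to attack NCut is its spectral relaxation (normalized-Laplacian eigenvectors followed by k-means, in the Ng-Jordan-Weiss style); a minimal sketch is shown below. The dense affinity matrix and the use of scikit-learn's KMeans are assumptions of the example, not details from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def spectral_ncut(A, k):
    """Approximate k-way normalized cut for a symmetric, nonnegative affinity matrix A."""
    d = A.sum(axis=1)
    d_isqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L_sym = np.eye(A.shape[0]) - d_isqrt[:, None] * A * d_isqrt[None, :]
    _, evecs = np.linalg.eigh(L_sym)                   # eigenvalues in ascending order
    U = evecs[:, :k]                                   # k smallest eigenvectors
    U /= np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    return KMeans(n_clusters=k, n_init=10).fit_predict(U)
```

MFP works on the fractional NCut objective directly rather than going through this eigenvector relaxation.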
Learning to Embed Distributions via Maximum Kernel Entropy
·1819 words·9 mins·
AI Generated
Machine Learning
Unsupervised Learning
Dipartimento di Matematica, Università degli Studi di Genova
Learn optimal data-dependent distribution kernels via Maximum Kernel Entropy, eliminating manual kernel selection and boosting performance on various downstream tasks.
Learning Diffusion Priors from Observations by Expectation Maximization
·3368 words·16 mins·
AI Generated
Machine Learning
Unsupervised Learning
University of Liège
This research introduces an Expectation-Maximization algorithm to train diffusion models from incomplete and noisy data, enabling their use in data-scarce scientific applications.
LaSCal: Label-Shift Calibration without target labels
·3140 words·15 mins·
Machine Learning
Unsupervised Learning
ESAT-PSI, KU Leuven
LaSCal, a novel label-free calibration method, ensures reliable model predictions under label shift by using a consistent calibration error estimator, achieving effective and robust unsupervised calibration.
Interactive Deep Clustering via Value Mining
·1729 words·9 mins·
Machine Learning
Unsupervised Learning
Sichuan University
Interactive Deep Clustering (IDC) significantly boosts deep clustering performance by strategically incorporating minimal user interaction to resolve ambiguous sample classifications.
Interaction-Force Transport Gradient Flows
·1588 words·8 mins·
Machine Learning
Unsupervised Learning
Humboldt University of Berlin
New gradient flow geometry improves MMD-based sampling by teleporting particle mass, guaranteeing global exponential convergence, and yielding superior empirical results.
Fair Kernel K-Means: from Single Kernel to Multiple Kernel
·1853 words·9 mins·
Machine Learning
Unsupervised Learning
Anhui University
The Fair Kernel K-Means (FKKM) framework ensures fair data partitioning by integrating a novel fairness regularization term into the kernel k-means algorithm, extending this to multiple kernel settings.
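The base algorithm being regularized is standard kernel k-means; a minimal sketch is below. The fairness term itself is specific to the paper and not reproduced, and `kernel_kmeans` is an illustrative implementation, not the authors' code.

```python
import numpy as np

def kernel_kmeans(K, k, n_iter=100, seed=0):
    """Lloyd-style kernel k-means on a precomputed (n, n) kernel matrix K."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(k, size=n)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            if not mask.any():
                continue                     # leave empty clusters at infinite distance
            nc = mask.sum()
            # squared distance to the feature-space mean, dropping the constant K_ii term
            dist[:, c] = (-2.0 * K[:, mask].sum(axis=1) / nc
                          + K[np.ix_(mask, mask)].sum() / nc ** 2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels
```

FKKM augments this objective with its fairness regularization term, and the multiple-kernel variant applies the same idea when several candidate kernels are combined.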
Expected Probabilistic Hierarchies
·4277 words·21 mins·
Machine Learning
Unsupervised Learning
Munich Data Science Institute
Expected Probabilistic Hierarchies (EPH) offers a novel, scalable approach to hierarchical clustering by optimizing expected scores under a probabilistic model, outperforming existing methods on various datasets.