
Deep Learning

Post-Hoc Reversal: Are We Selecting Models Prematurely?
·2661 words·13 mins
Machine Learning Deep Learning 🏒 Stanford University
Post-hoc model transformations can reverse performance trends, prompting a reevaluation of model selection strategies and suggesting a new ‘post-hoc selection’ method for improved model development.
Physics-Informed Variational State-Space Gaussian Processes
·1537 words·8 mins
Machine Learning Deep Learning 🏒 University of Warwick
PHYSS-GP: a novel physics-informed state-space Gaussian process model for efficient spatio-temporal data modeling, outperforming existing methods in predictive accuracy and computational speed.
Physics-Informed Regularization for Domain-Agnostic Dynamical System Modeling
·1913 words·9 mins
Machine Learning Deep Learning 🏒 UC Los Angeles
TREAT: a novel framework boosting dynamical systems modeling accuracy by enforcing Time-Reversal Symmetry (TRS) via a regularization term. High-precision modeling is achieved across diverse systems, …
Physical Consistency Bridges Heterogeneous Data in Molecular Multi-Task Learning
·3395 words·16 mins
AI Generated Machine Learning Deep Learning 🏒 Microsoft Research
Physically consistent multi-task learning bridges heterogeneous molecular data by directly leveraging physical laws to improve predictions, enhancing accuracy beyond the limitations of individual data…
PGN: The RNN's New Successor is Effective for Long-Range Time Series Forecasting
·3392 words·16 mins
Machine Learning Deep Learning 🏒 Beijing Jiaotong University
TPGN, a novel framework for long-range time series forecasting, uses Parallel Gated Networks (PGN) to efficiently capture long-term dependencies, achieving state-of-the-art results on multiple dataset…
Persistent Test-time Adaptation in Recurring Testing Scenarios
·5361 words·26 mins
AI Generated Machine Learning Deep Learning 🏒 University of Illinois at Urbana-Champaign
Persistent Test-Time Adaptation (PeTTA) prevents AI model collapse in recurring scenarios by dynamically adjusting the adaptation strategy based on divergence from the initial model, ensuring long-ter…
pcaGAN: Improving Posterior-Sampling cGANs via Principal Component Regularization
·2599 words·13 mins
Machine Learning Deep Learning 🏒 Ohio State University
pcaGAN boosts posterior-sampling cGANs by using principal component regularization, achieving faster, more accurate results in various imaging tasks.
Parametric model reduction of mean-field and stochastic systems via higher-order action matching
·2431 words·12 mins
AI Generated Machine Learning Deep Learning 🏒 New York University
HOAM learns reduced models of population dynamics for complex systems, enabling fast predictions across various physics parameters, outperforming state-of-the-art techniques.
PageRank Bandits for Link Prediction
·2009 words·10 mins
Machine Learning Deep Learning 🏒 University of Illinois Urbana-Champaign
PageRank Bandits (PRB) revolutionizes link prediction by framing it as a sequential decision-making problem, thus enabling the system to adapt to evolving data. Combining contextual bandits with PageR…
PACE: Pacing Operator Learning to Accurate Optical Field Simulation for Complicated Photonic Devices
·2611 words·13 mins
Machine Learning Deep Learning 🏒 University of Texas at Austin
PACE, a novel neural operator, achieves unprecedented accuracy and speed in optical field simulation for complex photonic devices, surpassing existing methods by significantly reducing errors and boos…
P$^2$C$^2$Net: PDE-Preserved Coarse Correction Network for efficient prediction of spatiotemporal dynamics
·3055 words·15 mins
AI Generated Machine Learning Deep Learning 🏒 Gaoling School of Artificial Intelligence, Renmin University of China
P2C2Net: A physics-encoded neural network efficiently predicts complex spatiotemporal dynamics using coarse grids and limited training data, achieving state-of-the-art results.
Out-Of-Distribution Detection with Diversification (Provably)
·1846 words·9 mins
Machine Learning Deep Learning 🏒 College of Intelligence and Computing, Tianjin University
Boost OOD detection accuracy with diverseMix: a novel method enhancing auxiliary outlier diversity, provably improving generalization and achieving state-of-the-art results.
Ordered Momentum for Asynchronous SGD
·2380 words·12 mins
AI Generated Machine Learning Deep Learning 🏒 National Key Laboratory for Novel Software Technology, School of Computer Science, Nanjing University
Ordered Momentum (OrMo) significantly boosts asynchronous stochastic gradient descent (ASGD) convergence by cleverly incorporating momentum, resolving prior convergence issues. This novel approach is…
Optimal Rates for Vector-Valued Spectral Regularization Learning Algorithms
·231 words·2 mins
AI Generated Machine Learning Deep Learning 🏒 Gatsby Computational Neuroscience Unit
Vector-valued spectral learning algorithms finally get rigorous theoretical backing, showing optimal learning rates and resolving the saturation effect puzzle.
Open-Book Neural Algorithmic Reasoning
·1944 words·10 mins
AI Generated Machine Learning Deep Learning 🏒 East China Normal University
This paper introduces open-book neural algorithmic reasoning, a novel framework that significantly enhances neural reasoning capabilities by allowing networks to access and utilize all training instan…
Online Relational Inference for Evolving Multi-agent Interacting Systems
·2683 words·13 mins
AI Generated Machine Learning Deep Learning 🏒 Georgia Institute of Technology
ORI: a novel online relational inference framework efficiently identifies hidden interaction graphs in evolving multi-agent systems using streaming data and real-time adaptation.
On the Target-kernel Alignment: a Unified Analysis with Kernel Complexity
·2457 words·12 mins
Machine Learning Deep Learning 🏒 School of Statistics and Management, Shanghai University of Finance and Economics
Truncated kernel methods consistently outperform standard methods by eliminating the saturation effect, offering faster learning rates and enhanced theoretical guarantees.
On the Scalability of GNNs for Molecular Graphs
·2680 words·13 mins
Machine Learning Deep Learning 🏒 Valence Labs
Giant leap in molecular GNNs! MolGPS, a new foundation model, achieves state-of-the-art performance on molecular property prediction by leveraging massive datasets and demonstrating the scalability o…
On the Scalability of Certified Adversarial Robustness with Generated Data
·2448 words·12 mins
Machine Learning Deep Learning 🏒 Machine Learning and Data Analytics Lab, FAU Erlangen-Nürnberg, Germany
Boosting certified robustness of machine learning models by 3-4% using generated data from diffusion models!
On the Limitations of Fractal Dimension as a Measure of Generalization
·1917 words·9 mins
Machine Learning Deep Learning 🏒 University of Oxford
Fractal dimension, while showing promise, fails to consistently predict neural network generalization due to hyperparameter influence and adversarial initializations, prompting further research.