Deep Learning
Decomposable Transformer Point Processes
·2120 words·10 mins·
AI Generated
Machine Learning
Deep Learning
🏢 University of Cambridge
Decomposable Transformer Point Processes (DTPP) dramatically accelerates marked point process inference by using a mixture of log-normals for inter-event times and Transformers for marks, outperforming…
DDN: Dual-domain Dynamic Normalization for Non-stationary Time Series Forecasting
·2680 words·13 mins·
Machine Learning
Deep Learning
🏢 Tsinghua University
DDN: Dual-domain Dynamic Normalization improves time series forecasting accuracy by addressing data distribution shifts in both the time and frequency domains via a plug-in module.
Data Free Backdoor Attacks
·2377 words·12 mins·
AI Generated
Machine Learning
Deep Learning
🏢 Pennsylvania State University
Data-Free Backdoor Attacks (DFBA) inject undetectable backdoors into pre-trained classifiers without retraining or architectural changes, bypassing existing defenses.
DASH: Warm-Starting Neural Network Training in Stationary Settings without Loss of Plasticity
·4111 words·20 mins·
Machine Learning
Deep Learning
🏢 Graduate School of AI, KAIST
DASH combats neural network training’s plasticity loss during warm-starting by selectively forgetting memorized noise while preserving features, improving accuracy and efficiency.
Cross-Device Collaborative Test-Time Adaptation
·2757 words·13 mins·
Machine Learning
Deep Learning
🏢 South China University of Technology
CoLA: Collaborative Lifelong Adaptation boosts test-time adaptation efficiency by sharing domain knowledge across multiple devices, achieving significant accuracy gains with minimal computational overhead…
CRONOS: Enhancing Deep Learning with Scalable GPU Accelerated Convex Neural Networks
·2029 words·10 mins·
Machine Learning
Deep Learning
🏢 Stanford University
CRONOS scales convex neural network training to ImageNet with GPU acceleration!
Credal Deep Ensembles for Uncertainty Quantification
·3555 words·17 mins·
Machine Learning
Deep Learning
🏢 KU Leuven
Credal Deep Ensembles (CreDEs) improve uncertainty quantification in deep learning by predicting probability intervals, enhancing accuracy and calibration, particularly for out-of-distribution data.
Counter-Current Learning: A Biologically Plausible Dual Network Approach for Deep Learning
·1908 words·9 mins·
Machine Learning
Deep Learning
🏢 Cornell University
Biologically inspired Counter-Current Learning (CCL) uses dual networks for deep learning, offering comparable performance to other biologically plausible algorithms while enhancing biological realism…
Convolutions and More as Einsum: A Tensor Network Perspective with Advances for Second-Order Methods
·8742 words·42 mins·
Machine Learning
Deep Learning
🏢 Vector Institute
This paper accelerates second-order optimization in CNNs by 4.5x, using a novel tensor network representation that simplifies convolutions and reduces memory overhead.
ControlSynth Neural ODEs: Modeling Dynamical Systems with Guaranteed Convergence
·2928 words·14 mins·
Machine Learning
Deep Learning
🏢 Southeast University
ControlSynth Neural ODEs (CSODEs) guarantee convergence in complex dynamical systems via tractable linear inequalities, improving neural ODE modeling.
Constrained Diffusion Models via Dual Training
·2031 words·10 mins·
AI Generated
Machine Learning
Deep Learning
🏢 University of Pennsylvania
Constrained diffusion models, trained via a novel dual approach, achieve optimal trade-offs between data fidelity and user-defined distribution constraints, enabling fairer and more controlled data generation…
Constant Acceleration Flow
·3228 words·16 mins·
AI Generated
Machine Learning
Deep Learning
🏢 Korea University
Constant Acceleration Flow (CAF) drastically accelerates image generation in diffusion models by leveraging a constant acceleration equation, outperforming state-of-the-art methods in both speed and quality…
Consistency Models for Scalable and Fast Simulation-Based Inference
·3014 words·15 mins·
Machine Learning
Deep Learning
🏢 University of Stuttgart
CMPE, a new conditional sampler for simulation-based inference (SBI), achieves fast few-shot inference with an unconstrained architecture, outperforming current state-of-the-art algorithms on various benchmarks.
Connectivity Shapes Implicit Regularization in Matrix Factorization Models for Matrix Completion
·4538 words·22 mins·
AI Generated
Machine Learning
Deep Learning
🏢 Shanghai Jiao Tong University
Data connectivity profoundly shapes implicit regularization in matrix factorization for matrix completion, transitioning from low nuclear norm to low rank solutions as data shifts from disconnected to connected…
Conformalized Time Series with Semantic Features
·1569 words·8 mins·
Machine Learning
Deep Learning
🏢 UC Los Angeles
Conformalized Time Series with Semantic Features (CT-SSF) significantly improves time-series forecasting by dynamically weighting latent semantic features, achieving greater prediction efficiency while…
Conformalized Multiple Testing after Data-dependent Selection
·2034 words·10 mins·
AI Generated
Machine Learning
Deep Learning
🏢 Nankai University
This paper introduces Selective Conformal P-Value (SCPV), a novel method for controlling FDR in conformalized multiple testing after data-dependent selection, offering a unified theoretical framework …
Conformalized Credal Set Predictors
·2420 words·12 mins·
Machine Learning
Deep Learning
🏢 LMU Munich, MCML
Conformal prediction empowers robust credal set predictors that handle aleatoric and epistemic uncertainty in classification, with validity guaranteed with high probability!
Conformal Prediction for Class-wise Coverage via Augmented Label Rank Calibration
·4855 words·23 mins·
Machine Learning
Deep Learning
🏢 Washington State University
RC3P, a novel algorithm, significantly reduces prediction set sizes in class-conditional conformal prediction while guaranteeing class-wise coverage, even on imbalanced datasets.
Confidence Calibration of Classifiers with Many Classes
·6165 words·29 mins·
Machine Learning
Deep Learning
🏢 IRT SystemX
Boost multi-class classifier calibration by cleverly transforming the problem into a single binary calibration task!
CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting
·3169 words·15 mins·
Machine Learning
Deep Learning
🏢 Shanghai Jiao Tong University
CondTSF: One-line plugin for time series forecasting dataset condensation, boosting performance at low condensation ratios.