
Posters

2024

Adversarially Robust Multi-task Representation Learning
·252 words·2 mins
Machine Learning Transfer Learning 🏢 Johns Hopkins University
Multi-task learning boosts adversarial robustness in transfer learning by leveraging diverse source data to build a shared representation, enabling effective learning in data-scarce target tasks, as p…
Adversarially Robust Dense-Sparse Tradeoffs via Heavy-Hitters
·388 words·2 mins
AI Generated AI Theory Robustness 🏢 Carnegie Mellon University
Improved adversarially robust streaming algorithms for L_p estimation are presented, surpassing previous state-of-the-art space bounds and disproving the existence of inherent barriers.
Adversarially Robust Decision Transformer
·2778 words·14 mins
AI Theory Robustness 🏢 University College London
Adversarially Robust Decision Transformer (ARDT) enhances offline RL robustness against powerful adversaries by conditioning policies on minimax returns, achieving superior worst-case performance.
Adversarial Schrödinger Bridge Matching
·3165 words·15 mins
Computer Vision Image Generation 🏢 Skoltech
Discrete-time IMF accelerates Schrödinger Bridge Matching to just a few steps, matching the results of existing hundred-step methods via a D-GAN implementation.
Adversarial Representation Engineering: A General Model Editing Framework for Large Language Models
·1740 words·9 mins
Natural Language Processing Large Language Models 🏢 Peking University
Adversarial Representation Engineering (ARE) offers a unified, interpretable approach for editing large language models (LLMs) by using a representation sensor as an editing oracle, enhancing model sa…
Adversarial Moment-Matching Distillation of Large Language Models
·2972 words·14 mins
AI Generated Natural Language Processing Large Language Models 🏢 SI-TECH Information Technology
Boosting LLM efficiency, this study introduces adversarial moment-matching distillation, outperforming existing methods by matching action-value moments for superior knowledge transfer and achieving s…
Advancing Training Efficiency of Deep Spiking Neural Networks through Rate-based Backpropagation
·2050 words·10 mins
Machine Learning Deep Learning 🏢 Zhejiang University
Rate-based backpropagation boosts deep spiking neural network training efficiency by leveraging rate coding, achieving comparable performance to BPTT with reduced complexity.
Advancing Tool-Augmented Large Language Models: Integrating Insights from Errors in Inference Trees
·1987 words·10 mins
Natural Language Processing Large Language Models 🏢 National Key Laboratory for Novel Software Technology, Nanjing University
TP-LLaMA boosts tool-augmented LLMs by optimizing inference trajectories using preference learning from both successful and failed attempts, achieving superior performance and efficiency.
Advancing Open-Set Domain Generalization Using Evidential Bi-Level Hardest Domain Scheduler
·3296 words·16 mins
Machine Learning Meta Learning 🏢 Karlsruhe Institute of Technology
EBiL-HaDS, a novel adaptive domain scheduler, significantly boosts open-set domain generalization by strategically sequencing training domains based on model reliability assessments.
Advancing Fine-Grained Classification by Structure and Subject Preserving Augmentation
·4124 words·20 mins
AI Generated Computer Vision Image Classification 🏢 Reichman University
SaSPA, a novel data augmentation method, boosts fine-grained visual classification accuracy by generating diverse, class-consistent synthetic images using structural and subject-preserving techniques.
Advancing Cross-domain Discriminability in Continual Learning of Vision-Language Models
·2348 words·12 mins
AI Generated Natural Language Processing Vision-Language Models 🏢 Greater Bay Area Institute for Innovation, Hunan University
RAIL, a novel continual learning method for vision-language models, tackles catastrophic forgetting and maintains zero-shot abilities without domain-identity hints or reference data. Using a recursiv…
AdvAD: Exploring Non-Parametric Diffusion for Imperceptible Adversarial Attacks
·2221 words·11 mins
Computer Vision Adversarial Attacks 🏢 Guangdong Key Lab of Information Security
AdvAD: A non-parametric diffusion process crafts imperceptible adversarial examples by subtly guiding an initial noise towards a target distribution, achieving high attack success rates with minimal p…
ADOPT: Modified Adam Can Converge with Any $\beta_2$ with the Optimal Rate
·1889 words·9 mins
Machine Learning Deep Learning 🏢 University of Tokyo
ADOPT, a novel adaptive gradient method, achieves the optimal convergence rate without the restrictive assumptions Adam requires, improving deep learning optimization.
Adjust Pearson's $r$ to Measure Arbitrary Monotone Dependence
·1286 words·7 mins
AI Generated AI Theory Optimization 🏢 Beijing University of Posts and Telecommunications
Researchers refine Pearson’s correlation coefficient to precisely measure arbitrary monotone dependence, expanding its applicability beyond linear relationships.
AdjointDEIS: Efficient Gradients for Diffusion Models
·1797 words·9 mins
Computer Vision Face Recognition 🏢 Clarkson University
AdjointDEIS: Efficient gradients for diffusion models via bespoke ODE solvers, simplifying backpropagation and improving guided generation.
Addressing Spectral Bias of Deep Neural Networks by Multi-Grade Deep Learning
·2240 words·11 mins
Machine Learning Deep Learning 🏢 Department of Mathematics and Statistics, Old Dominion University
Multi-Grade Deep Learning (MGDL) conquers spectral bias in deep neural networks by incrementally learning low-frequency components, ultimately capturing high-frequency features through composition.
Addressing Spatial-Temporal Heterogeneity: General Mixed Time Series Analysis via Latent Continuity Recovery and Alignment
·4775 words·23 mins
AI Generated Machine Learning Self-Supervised Learning 🏢 College of Control Science and Engineering, Zhejiang University, China
MiTSformer, a novel framework, recovers latent continuous variables from discrete data to enable complete spatial-temporal modeling of mixed time series, achieving state-of-the-art performance on mult…
Addressing Hidden Confounding with Heterogeneous Observational Datasets for Recommendation
·2627 words·13 mins
AI Generated Machine Learning Meta Learning 🏢 Peking University
MetaDebias tackles hidden confounding in recommender systems using heterogeneous observational data, achieving state-of-the-art performance without expensive RCT data.
Addressing Bias in Online Selection with Limited Budget of Comparisons
·2019 words·10 mins
AI Theory Optimization 🏢 ENSAE
This paper introduces efficient algorithms for online selection under a budget constraint when comparing candidates across groups is costly, improving both fairness and efficiency.
Addressing Asynchronicity in Clinical Multimodal Fusion via Individualized Chest X-ray Generation
·3843 words·19 mins
AI Applications Healthcare 🏢 Hong Kong Polytechnic University
DDL-CXR dynamically generates up-to-date chest X-ray image representations using latent diffusion models, effectively addressing asynchronous multimodal clinical data for improved prediction.