Posters
2024
Leveraging Visual Tokens for Extended Text Contexts in Multi-Modal Learning
·2448 words·12 mins·
Multimodal Learning
Vision-Language Models
🏢 Show Lab, National University of Singapore
Visual tokens extend the text context of long-text multi-modal models.
Leveraging Tumor Heterogeneity: Heterogeneous Graph Representation Learning for Cancer Survival Prediction in Whole Slide Images
·2284 words·11 mins·
AI Applications
Healthcare
🏢 Jiangsu Provincial Joint International Research Laboratory of Medical Information Processing
ProtoSurv uses heterogeneous graph representation learning to predict cancer survival more accurately by incorporating tumor heterogeneity and tissue spatial relationships from WSIs.
Leveraging Separated World Model for Exploration in Visually Distracted Environments
·2320 words·11 mins·
Machine Learning
Reinforcement Learning
🏢 School of Artificial Intelligence, Nanjing University, China
SeeX, a novel bi-level optimization framework, effectively tackles the challenge of exploration in visually cluttered environments by training a separated world model to extract relevant information a…
Leveraging partial stragglers within gradient coding
·1641 words·8 mins·
Machine Learning
Federated Learning
🏢 Iowa State University
New gradient coding protocols efficiently leverage partial results from slow worker nodes, accelerating distributed training by approximately 2x and significantly improving accuracy.
Leveraging Hallucinations to Reduce Manual Prompt Dependency in Promptable Segmentation
·3134 words·15 mins·
AI Generated
Computer Vision
Image Segmentation
🏢 School of Electronic Engineering and Computer Science, Queen Mary University of London
ProMaC leverages MLLM hallucinations in an iterative framework to generate precise prompts for accurate object segmentation, minimizing manual prompt dependency.
Leveraging Environment Interaction for Automated PDDL Translation and Planning with Large Language Models
·1918 words·10 mins·
Natural Language Processing
Large Language Models
🏢 University of British Columbia
This paper presents a fully automated method for PDDL translation and planning using LLMs and environment interaction, achieving a 66% success rate on challenging PDDL domains.
Leveraging Drift to Improve Sample Complexity of Variance Exploding Diffusion Models
·1640 words·8 mins·
Machine Learning
Deep Learning
🏢 John Hopcroft Center for Computer Science
Drifted VESDE adds a drift term to variance-exploding diffusion models, yielding faster convergence and more efficient sampling.
Leveraging Contrastive Learning for Enhanced Node Representations in Tokenized Graph Transformers
·1604 words·8 mins·
Machine Learning
Representation Learning
🏢 Huazhong University of Science and Technology
GCFormer, a novel graph Transformer, enhances node representation learning by employing a hybrid token generator and contrastive learning, outperforming existing methods on various datasets.
Leveraging an ECG Beat Diffusion Model for Morphological Reconstruction from Indirect Signals
·3805 words·18 mins·
AI Generated
AI Applications
Healthcare
🏢 Ecole Polytechnique
BeatDiff, a lightweight diffusion model, reconstructs ECG morphology from indirect signals, enabling noise removal, artifact reduction, missing-lead recovery, and anomaly detection.
Lever LM: Configuring In-Context Sequence to Lever Large Vision Language Models
·2923 words·14 mins·
Multimodal Learning
Vision-Language Models
🏢 Southeast University
Lever-LM configures effective in-context demonstrations for large vision-language models using a small language model, significantly improving their performance on visual question answering and image …
LESS: Label-Efficient and Single-Stage Referring 3D Segmentation
·2019 words·10 mins·
Natural Language Processing
Vision-Language Models
🏢 College of Computer Science and Software Engineering, Shenzhen University
LESS achieves state-of-the-art Referring 3D Segmentation using only binary masks, significantly reducing labeling effort and improving efficiency with a novel single-stage pipeline.
Length Optimization in Conformal Prediction
·2805 words·14 mins·
AI Generated
AI Applications
Healthcare
🏢 University of Pennsylvania
Conformal Prediction with Length Optimization (CPL) achieves shorter, conditionally valid prediction sets by optimizing length while ensuring coverage across various covariate shifts.
LeDex: Training LLMs to Better Self-Debug and Explain Code
·3820 words·18 mins·
AI Generated
Natural Language Processing
Large Language Models
🏢 Purdue University
LEDEX: A novel training framework significantly boosts LLMs’ code self-debugging by using automated data collection, supervised fine-tuning, and reinforcement learning, leading to more accurate code a…
Least Squares Regression Can Exhibit Under-Parameterized Double Descent
·3874 words·19 mins·
AI Generated
AI Theory
Generalization
🏢 Applied Math, Yale University
Under-parameterized linear regression models can surprisingly exhibit double descent, contradicting traditional bias-variance assumptions.
Learning-to-Cache: Accelerating Diffusion Transformer via Layer Caching
·2463 words·12 mins·
Computer Vision
Image Generation
🏢 National University of Singapore
Learning-to-Cache (L2C) dramatically accelerates diffusion transformers by intelligently caching layer computations, achieving significant speedups with minimal performance loss.
Learning-Augmented Priority Queues
·2516 words·12 mins·
AI Theory
Optimization
🏢 ENSAE, Ecole Polytechnique
This paper introduces learning-augmented priority queues that use predictions to improve efficiency, achieving significant performance gains over traditional methods.
Learning-Augmented Dynamic Submodular Maximization
·387 words·2 mins·
AI Theory
Optimization
🏢 Indian Institute of Technology Bombay
Leveraging predictions, this paper presents a novel algorithm for dynamic submodular maximization achieving significantly faster update times (O(poly(log n, log w, log k)) amortized) compared to exist…
Learning-Augmented Approximation Algorithms for Maximum Cut and Related Problems
·249 words·2 mins·
AI Theory
Optimization
🏢 Google Research
This paper shows how noisy predictions about optimal solutions can improve approximation algorithms for NP-hard problems like MAX-CUT, exceeding classical hardness bounds.
Learning-Augmented Algorithms with Explicit Predictors
·3004 words·15 mins·
AI Generated
AI Theory
Optimization
🏢 Bocconi University
This paper introduces a novel framework for learning-augmented algorithms that improves performance by integrating the learning process into the algorithm itself, rather than treating the predictor as…
Learning-Augmented Algorithms for the Bahncard Problem
·3280 words·16 mins·
AI Theory
Optimization
🏢 Zhejiang University
PFSUM, a novel learning-augmented algorithm, leverages short-term predictions to achieve superior performance in solving the Bahncard problem, outperforming existing methods with improved consistency …