
Posters

2024

UltraPixel: Advancing Ultra High-Resolution Image Synthesis to New Peaks
·3265 words·16 mins
Computer Vision Image Generation 🏢 Hong Kong University of Science and Technology
UltraPixel efficiently generates high-quality images at resolutions from 1K to 6K using cascade diffusion models, achieving state-of-the-art performance.
Ultrafast classical phylogenetic method beats large protein language models on variant effect prediction
·2536 words·12 mins
AI Generated AI Theory Optimization 🏢 UC Berkeley
A revolutionary ultrafast phylogenetic method outperforms protein language models in variant effect prediction by efficiently estimating amino acid substitution rates from massive datasets.
UGC: Universal Graph Coarsening
·2262 words·11 mins
Machine Learning Deep Learning 🏢 Yardi School of Artificial Intelligence
UGC: Blazing-fast graph coarsening for big data, preserving key insights across diverse graph types.
UDPM: Upsampling Diffusion Probabilistic Models
·3261 words·16 mins
AI Generated Computer Vision Image Generation 🏢 Tel Aviv University
UDPM: Upsampling Diffusion Probabilistic Models achieves high-quality image generation with fewer computations by incorporating downsampling and upsampling within the diffusion process.
UDON: Universal Dynamic Online distillatioN for generic image representations
·2160 words·11 mins
Computer Vision Image Representation Learning 🏢 Czech Technical University in Prague
UDON: a novel multi-teacher online distillation method creates highly efficient universal image embeddings by dynamically transferring domain-specific knowledge and adapting to imbalanced data.
UDC: A Unified Neural Divide-and-Conquer Framework for Large-Scale Combinatorial Optimization Problems
·5789 words·28 mins
AI Generated AI Theory Optimization 🏢 School of System Design and Intelligent Manufacturing, Southern University of Science and Technology
A unified neural divide-and-conquer framework (UDC) achieves superior performance on large-scale combinatorial optimization problems by employing a novel Divide-Conquer-Reunion training method and a h…
U-DiTs: Downsample Tokens in U-Shaped Diffusion Transformers
·2151 words·11 mins
Computer Vision Image Generation 🏢 Peking University
U-DiT: Revolutionizing diffusion transformers with a U-Net design and token downsampling for superior image generation and drastically reduced computation cost.
Typicalness-Aware Learning for Failure Detection
·2037 words·10 mins
Computer Vision Failure Detection 🏢 Tencent Youtu Lab
Typicalness-Aware Learning (TAL) improves failure detection by dynamically adjusting prediction confidence based on sample typicality, mitigating overconfidence and achieving significant performance g…
Two-way Deconfounder for Off-policy Evaluation in Causal Reinforcement Learning
·1675 words·8 mins
Machine Learning Reinforcement Learning 🏢 Shanghai University of Finance and Economics
Two-way Deconfounder tackles off-policy evaluation challenges by introducing a novel two-way unmeasured confounding assumption and a neural-network-based deconfounder, achieving consistent policy valu…
Twin-Merging: Dynamic Integration of Modular Expertise in Model Merging
·2935 words·14 mins
AI Generated Natural Language Processing Large Language Models 🏢 Huazhong University of Science and Technology
Twin-Merging dynamically merges modular model expertise, significantly improving multitask performance without retraining while adapting to diverse data.
TurboHopp: Accelerated Molecule Scaffold Hopping with Consistency Models
·2569 words·13 mins
AI Applications Healthcare 🏢 AIGEN Sciences
TurboHopp: 30x faster 3D scaffold hopping with consistency models, boosting drug discovery!
TuneTables: Context Optimization for Scalable Prior-Data Fitted Networks
·4869 words·23 mins
AI Generated Machine Learning Deep Learning 🏢 New York University
TuneTables optimizes PFNs for scalability via context optimization, achieving state-of-the-art performance on large tabular datasets while using fewer parameters and reducing inference time.
TSDS: Data Selection for Task-Specific Model Finetuning
·2005 words·10 mins
Natural Language Processing Large Language Models 🏢 University of Wisconsin-Madison
TSDS: A novel framework selects optimal training data for efficient large language model finetuning using only a few examples, boosting performance.
Truthfulness of Calibration Measures
·337 words·2 mins
AI Theory Optimization 🏢 UC Berkeley
Researchers developed Subsampled Smooth Calibration Error (SSCE), a new truthful calibration measure for sequential prediction, addressing the problem that existing measures can be easily gamed.
Truthful High Dimensional Sparse Linear Regression
·282 words·2 mins
AI Theory Privacy 🏢 King Abdullah University of Science and Technology
This paper presents a novel, truthful, and privacy-preserving mechanism for high-dimensional sparse linear regression, incentivizing data contribution while safeguarding individual privacy.
Truth is Universal: Robust Detection of Lies in LLMs
·4200 words·20 mins
Natural Language Processing Large Language Models 🏢 Heidelberg University
LLM lie detectors fail to generalize; this paper presents a robust method achieving 94% accuracy by identifying a universal two-dimensional truth subspace, separating true/false statements across vari…
Truncated Variance Reduced Value Iteration
·1418 words·7 mins
Machine Learning Reinforcement Learning 🏢 Stanford University
Faster algorithms for solving discounted Markov Decision Processes (DMDPs) are introduced, achieving near-optimal sample and time complexities, especially in the sample setting and improving runtimes …
TripletCLIP: Improving Compositional Reasoning of CLIP via Synthetic Vision-Language Negatives
·3187 words·15 mins
Multimodal Learning Vision-Language Models 🏢 Arizona State University
TripletCLIP boosts CLIP’s compositional reasoning by cleverly generating synthetic hard negative image-text pairs, achieving over 9% absolute improvement on SugarCrepe.
Tri-Level Navigator: LLM-Empowered Tri-Level Learning for Time Series OOD Generalization
·1858 words·9 mins
Machine Learning Few-Shot Learning 🏢 Tongji University
LLM-powered Tri-level learning framework enhances time series OOD generalization.
TreeVI: Reparameterizable Tree-structured Variational Inference for Instance-level Correlation Capturing
·1694 words·8 mins
Machine Learning Variational Inference 🏢 School of Computer Science and Engineering, Sun Yat-Sen University
TreeVI: Scalable tree-structured variational inference captures instance-level correlations for improved model accuracy.