Posters
2024
Sourcerer: Sample-based Maximum Entropy Source Distribution Estimation
·4767 words·23 mins·
AI Generated
Machine Learning
Deep Learning
🏢 University of Tübingen
Sourcerer: A novel sample-based method for maximum entropy source distribution estimation, resolving ill-posedness while maintaining simulation accuracy.
Source Code Foundation Models are Transferable Binary Analysis Knowledge Bases
·2889 words·14 mins·
Natural Language Processing
Text Summarization
🏢 Purdue University
ProRec, a novel framework, bridges the binary-source semantic gap by using a binary-source encoder-decoder model and LLMs, achieving significant improvements in zero-shot binary summarization and function name recovery.
SongCreator: Lyrics-based Universal Song Generation
·3881 words·19 mins·
AI Generated
Speech and Audio
Music Generation
🏢 Shenzhen International Graduate School, Tsinghua University
SongCreator: a novel AI system generates complete, high-quality songs from lyrics, surpassing existing methods in lyrics-to-song and lyrics-to-vocals generation.
Solving Zero-Sum Markov Games with Continuous State via Spectral Dynamic Embedding
·391 words·2 mins·
Machine Learning
Reinforcement Learning
🏢 Zhejiang University
SDEPO, a new natural policy gradient algorithm, efficiently solves zero-sum Markov games with continuous state spaces, achieving near-optimal convergence independent of state space cardinality.
Solving Sparse & High-Dimensional-Output Regression via Compression
·2050 words·10 mins·
AI Generated
Machine Learning
Optimization
🏢 National University of Singapore
SHORE: a novel two-stage framework efficiently solves sparse & high-dimensional output regression, boosting interpretability and scalability.
Solving Minimum-Cost Reach Avoid using Reinforcement Learning
·2253 words·11 mins·
AI Generated
Machine Learning
Reinforcement Learning
🏢 MIT
RC-PPO: Reinforcement learning solves minimum-cost reach-avoid problems with up to 57% lower costs!
Solving Inverse Problems via Diffusion Optimal Control
·2106 words·10 mins·
AI Theory
Optimization
🏢 Yale University
Revolutionizing inverse problem solving, this paper introduces diffusion optimal control, a novel framework that converts signal recovery into a discrete optimal control problem, surpassing the limitations of existing approaches.
SOI: Scaling Down Computational Complexity by Estimating Partial States of the Model
·2817 words·14 mins·
Computer Vision
Action Recognition
🏢 Samsung AI Center Warsaw
Scattered Online Inference (SOI) drastically cuts down ANN computational complexity by leveraging data continuity and prediction seasonality, enabling faster real-time inference on low-power devices.
SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion
·5096 words·24 mins·
AI Generated
AI Applications
Finance
🏢 Nanjing University
SOFTS: An efficient MLP-based model for multivariate time series forecasting using a novel STAR module for efficient channel interaction.
Soft-Label Integration for Robust Toxicity Classification
·2918 words·14 mins·
AI Generated
Natural Language Processing
Text Classification
🏢 Northwestern University
Boosting toxicity classification robustness, this paper introduces a novel bi-level optimization framework integrating crowdsourced soft labels and GroupDRO to enhance resistance against out-of-distribution data.
Soft Tensor Product Representations for Fully Continuous, Compositional Visual Representations
·8738 words·42 mins·
Computer Vision
Representation Learning
🏢 UNSW, Sydney
Soft Tensor Product Representations (Soft TPRs) revolutionize compositional visual representation learning by seamlessly blending continuous vector spaces and compositional structures, leading to superior performance.
Soft Superpixel Neighborhood Attention
·3657 words·18 mins·
AI Generated
Computer Vision
Image Segmentation
🏢 Purdue University
Soft Superpixel Neighborhood Attention (SNA) optimally denoises images by incorporating superpixel probabilities into an attention module, outperforming traditional methods.
Soft Prompt Threats: Attacking Safety Alignment and Unlearning in Open-Source LLMs through the Embedding Space
·2028 words·10 mins·
Natural Language Processing
Large Language Models
🏢 Technical University of Munich
Open-source LLMs are vulnerable to embedding space attacks, which efficiently bypass safety mechanisms and enable data extraction, even after unlearning.
Soft ascent-descent as a stable and flexible alternative to flooding
·2106 words·10 mins·
Machine Learning
Deep Learning
🏢 Osaka University
Soft ascent-descent (SoftAD) softens the flooding method, offering competitive test accuracy and generalization alongside smaller losses and reduced model complexity.
SocialGPT: Prompting LLMs for Social Relation Reasoning via Greedy Segment Optimization
·2449 words·12 mins·
Multimodal Learning
Vision-Language Models
🏢 Harvard University
SocialGPT cleverly leverages Vision Foundation Models and Large Language Models for zero-shot social relation reasoning, achieving competitive results and offering interpretable outputs via prompt optimization.
SnapKV: LLM Knows What You are Looking for Before Generation
·2730 words·13 mins·
AI Generated
Natural Language Processing
Large Language Models
🏢 University of Illinois Urbana-Champaign
SnapKV: Slashing LLM memory usage & boosting speed via smart KV cache compression!
Smoothie: Label Free Language Model Routing
·3245 words·16 mins·
AI Generated
Natural Language Processing
Large Language Models
🏢 Stanford University
SMOOTHIE: Label-free LLM routing achieves up to 10% accuracy gains by using a latent variable model to estimate LLM quality without labeled data.
Smoothed Online Classification can be Harder than Batch Classification
·302 words·2 mins·
AI Generated
AI Theory
Optimization
🏢 University of Michigan
Smoothed online classification can be harder than batch classification when label spaces are unbounded, challenging existing assumptions in machine learning.
Smoothed Energy Guidance: Guiding Diffusion Models with Reduced Energy Curvature of Attention
·2786 words·14 mins·
Computer Vision
Image Generation
🏢 University of Washington
Smoothed Energy Guidance (SEG) improves unconditional image generation by reducing self-attention’s energy curvature, leading to higher-quality outputs with fewer artifacts.
Smoke and Mirrors in Causal Downstream Tasks
·2586 words·13 mins·
AI Theory
Causality
🏢 Institute of Science and Technology Austria
AI for science faces hidden biases in causal inference; this paper reveals these flaws using ant-behavior data, introduces the ISTAnt benchmark, and provides guidelines for more accurate causal AI.