Posters
2024
Conformalized Credal Set Predictors
·2420 words·12 mins
Machine Learning
Deep Learning
🏢 LMU Munich, MCML
Conformal prediction enables robust credal set predictions that capture both aleatoric and epistemic uncertainty in classification and are guaranteed to be valid with high probability!
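For readers new to the area, a minimal sketch of generic split conformal prediction for classification (the recipe this line of work builds on, not the paper's credal-set construction); all data below is a synthetic stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cal, n_classes, alpha = 500, 5, 0.1

# Pretend these came from a trained classifier on held-out calibration data.
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)

# Nonconformity score: one minus the probability assigned to the true label.
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile with the finite-sample (n+1) correction.
q = np.quantile(scores, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal, method="higher")

# Prediction set for a new input: every label whose score clears the quantile.
test_probs = rng.dirichlet(np.ones(n_classes))
prediction_set = np.where(1.0 - test_probs <= q)[0]
print(prediction_set)  # contains the true label with probability >= 1 - alpha
```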
Conformal Prediction for Class-wise Coverage via Augmented Label Rank Calibration
·4855 words·23 mins
Machine Learning
Deep Learning
🏢 Washington State University
RC3P, a novel algorithm, significantly reduces prediction set sizes in class-conditional conformal prediction while guaranteeing class-wise coverage, even on imbalanced datasets.
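The class-wise variant swaps the single calibration quantile for one quantile per class; a minimal sketch of that baseline (RC3P's label-rank augmentation is not reproduced here), again on synthetic stand-in data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cal, n_classes, alpha = 2000, 5, 0.1
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)   # stand-in softmax
cal_labels = rng.integers(0, n_classes, size=n_cal)
scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

q_per_class = np.empty(n_classes)
for k in range(n_classes):
    s_k = scores[cal_labels == k]                 # scores of class-k examples only
    n_k = len(s_k)
    level = min(np.ceil((n_k + 1) * (1 - alpha)) / n_k, 1.0)
    q_per_class[k] = np.quantile(s_k, level, method="higher")

# A label k enters the set if its own class-wise quantile admits it.
test_probs = rng.dirichlet(np.ones(n_classes))
prediction_set = [k for k in range(n_classes) if 1.0 - test_probs[k] <= q_per_class[k]]
print(prediction_set)
```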
Conformal Inverse Optimization
·1650 words·8 mins
AI Theory
Optimization
🏢 University of Toronto
Conformal inverse optimization learns uncertainty sets for parameters in optimization models, then solves a robust optimization model for high-quality, human-aligned decisions.
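As a toy illustration of the "learn an uncertainty set, then decide robustly" pipeline, the sketch below builds coordinate-wise conformal intervals for an unknown cost vector and picks the decision with the best worst-case cost; the paper's actual construction and guarantees differ:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_cal, alpha = 3, 200, 0.1

c_true = np.array([1.0, 2.0, 1.5])
point_estimate = c_true + 0.1 * rng.normal(size=d)              # stand-in for a learned model
cal_residuals = np.abs(rng.normal(scale=0.3, size=(n_cal, d)))  # |observed - predicted| per coord

# Per-coordinate conformal radius -> box uncertainty set [lo, hi].
r = np.quantile(cal_residuals, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal,
                axis=0, method="higher")
lo, hi = point_estimate - r, point_estimate + r

# Finite decision set; for x >= 0 the worst case over the box is attained at hi.
decisions = np.eye(d)                       # choose exactly one of three options
worst_case_cost = decisions @ hi
print("robust choice:", np.argmin(worst_case_cost))
```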
Conformal Classification with Equalized Coverage for Adaptively Selected Groups
·7699 words·37 mins
AI Theory
Fairness
🏢 UC Los Angeles
This paper introduces AFCP, a novel conformal inference method that generates prediction sets with valid coverage conditional on adaptively selected features, achieving a practical balance between efficiency and fairness.
Conformal Alignment: Knowing When to Trust Foundation Models with Guarantees
·3577 words·17 mins
Natural Language Processing
Question Answering
🏢 Department of Statistics, University of Chicago
Conformal Alignment certifies trustworthy foundation model outputs by guaranteeing a user-specified fraction meet alignment criteria, regardless of the model or data.
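A simplified sketch of the selection idea: keep outputs whose predicted alignment score clears a bar chosen on calibration data. The paper's method uses conformal p-values with a BH-style procedure to obtain the formal guarantee; this toy version only conveys the shape:

```python
import numpy as np

rng = np.random.default_rng(0)
cal_scores = rng.uniform(size=1000)                  # predicted alignment scores
cal_aligned = rng.uniform(size=1000) < cal_scores    # whether output truly met the criterion

alpha = 0.1
tau = 1.0  # fall back to selecting nothing if no threshold qualifies
for t in np.sort(cal_scores):
    sel = cal_scores >= t
    if sel.any() and cal_aligned[sel].mean() >= 1 - alpha:
        tau = t          # smallest bar with >= 90% aligned among selected
        break

test_scores = rng.uniform(size=5)
print("trusted outputs:", np.where(test_scores >= tau)[0])
```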
Confident Natural Policy Gradient for Local Planning in q_π-realizable Constrained MDPs
·227 words·2 mins
Machine Learning
Reinforcement Learning
🏢 University of Alberta
Confident-NPG-CMDP: First primal-dual algorithm achieving polynomial sample complexity for solving constrained Markov decision processes (CMDPs) using function approximation and a local-access model.
Confidence Regulation Neurons in Language Models
·3393 words·16 mins
AI Generated
Natural Language Processing
Large Language Models
🏢 ETH Zurich
LLMs regulate uncertainty via specialized ‘entropy’ and ‘token frequency’ neurons, impacting prediction confidence without directly altering logits.
Confidence Calibration of Classifiers with Many Classes
·6165 words·29 mins
Machine Learning
Deep Learning
🏢 IRT SystemX
Boost multi-class classifier calibration by cleverly transforming the problem into a single binary calibration task!
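One common way to realize such a reduction is to recalibrate the top-label confidence against correctness as a one-dimensional binary problem; a sketch of that generic idea (not necessarily the paper's exact transformation) using scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, k = 5000, 100
probs = rng.dirichlet(np.ones(k) * 0.1, size=n)      # overconfident-ish softmax outputs
preds = probs.argmax(axis=1)
labels = np.where(rng.uniform(size=n) < 0.6, preds, rng.integers(0, k, size=n))

conf = probs.max(axis=1)                 # top-label confidence (the 1-D feature)
correct = (preds == labels).astype(int)  # binary target: was the prediction right?

# Platt-style recalibration on the log-odds of the top-label confidence.
z = np.log(conf / (1 - conf + 1e-12)).reshape(-1, 1)
calibrator = LogisticRegression().fit(z, correct)
calibrated_conf = calibrator.predict_proba(z)[:, 1]
print("mean confidence before/after:", conf.mean(), calibrated_conf.mean(),
      "accuracy:", correct.mean())
```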
CondTSF: One-line Plugin of Dataset Condensation for Time Series Forecasting
·3169 words·15 mins
Machine Learning
Deep Learning
🏢 Shanghai Jiao Tong University
CondTSF: One-line plugin for time series forecasting dataset condensation, boosting performance at low condensation ratios.
Conditional Outcome Equivalence: A Quantile Alternative to CATE
·2270 words·11 mins
AI Theory
Causality
🏢 University of Bristol
Researchers introduce the Conditional Quantile Comparator (CQC) for analyzing heterogeneous treatment effects, offering an improved approach by combining the strengths of CATE and CQTE while overcoming their limitations.
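The raw ingredient behind quantile-based alternatives to CATE is a conditional quantile estimate per treatment arm; a sketch of those per-arm fits with scikit-learn's quantile gradient boosting (the CQC object itself is built on top of such estimates and is not reproduced here):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 4000
x = rng.uniform(-2, 2, size=(n, 1))
t = rng.integers(0, 2, size=n)                       # random treatment assignment
y = x[:, 0] + t * (0.5 + 0.5 * x[:, 0]) + rng.normal(scale=0.5 + 0.3 * t, size=n)

tau = 0.9  # quantile level of interest
models = {}
for arm in (0, 1):
    m = GradientBoostingRegressor(loss="quantile", alpha=tau, n_estimators=200)
    models[arm] = m.fit(x[t == arm], y[t == arm])

grid = np.linspace(-2, 2, 5).reshape(-1, 1)
q0, q1 = models[0].predict(grid), models[1].predict(grid)
print("CQTE at tau=0.9:", q1 - q0)   # quantile treatment effect, conditional on x
```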
Conditional Generative Models are Sufficient to Sample from Any Causal Effect Estimand
·3417 words·17 mins
AI Theory
Causality
🏢 Purdue University
ID-GEN: Sample high-dimensional interventional distributions using any conditional generative model!
Conditional Controllable Image Fusion
·2561 words·13 mins
Computer Vision
Image Fusion
🏢 College of Intelligence and Computing, Tianjin University
Conditional Controllable Fusion (CCF) achieves training-free, adaptable image fusion by dynamically injecting fusion conditions into a pre-trained denoising diffusion model.
Concentrate Attention: Towards Domain-Generalizable Prompt Optimization for Language Models
·3084 words·15 mins
Natural Language Processing
Text Classification
🏢 Xi'an Jiaotong University
Boost language model performance across domains with ‘Concentration’: a new prompt optimization objective that prioritizes stable, deep-layer attention.
Con4m: Context-aware Consistency Learning Framework for Segmented Time Series Classification
·2306 words·11 mins
Machine Learning
Deep Learning
🏢 Zhejiang University
Con4m, a novel consistency learning framework, leverages contextual information to effectively classify segmented time series with inconsistent boundary labels and varying class durations, significantly improving classification performance.
Computing the Bias of Constant-step Stochastic Approximation with Markovian Noise
·2187 words·11 mins
Machine Learning
Stochastic Approximation
🏢 Univ. Grenoble Alpes and Inria
New method quantifies & reduces bias in constant-step stochastic approximation algorithms with Markovian noise, improving accuracy and efficiency.
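A standard bias-reduction device in this setting is Richardson-Romberg extrapolation: run the same constant-step recursion at two step sizes and combine the time-averaged iterates to cancel the first-order bias term. A toy sketch on a root-finding problem with AR(1) noise (illustrative only, not the paper's estimator):

```python
import numpy as np

# Find theta* solving E[theta^3 + theta - X] = 0 with E[X] = 2, so theta* = 1.
rng = np.random.default_rng(0)

def averaged_iterate(step, n_iter=500_000, rho=0.8, mu=2.0, sigma=1.0):
    x, theta, acc = mu, 0.0, 0.0
    burn = n_iter // 10                  # discard the transient phase
    for i in range(n_iter):
        # AR(1) (Markovian) noise with marginal mean mu and variance sigma^2.
        x = mu + rho * (x - mu) + sigma * np.sqrt(1 - rho**2) * rng.normal()
        theta -= step * (theta**3 + theta - x)
        if i >= burn:
            acc += theta
    return acc / (n_iter - burn)

a = 0.05
t1, t2 = averaged_iterate(a), averaged_iterate(2 * a)
print("bias at step a:   ", t1 - 1.0)
print("bias at step 2a:  ", t2 - 1.0)
print("extrapolated bias:", (2 * t1 - t2) - 1.0)   # first-order term cancels
```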
Computerized Adaptive Testing via Collaborative Ranking
·2165 words·11 mins
AI Applications
Education
🏢 State Key Laboratory of Cognitive Intelligence
Collaborative Computerized Adaptive Testing (CCAT) improves student ranking accuracy in online exams by leveraging inter-student information to enhance ranking consistency.
Computational Aspects of Bayesian Persuasion under Approximate Best Response
·1555 words·8 mins
AI Generated
AI Theory
Robustness
🏢 UC Berkeley
This paper presents efficient algorithms for Bayesian persuasion under approximate best response, offering polynomial-time solutions for specific cases and a quasi-polynomial-time approximation scheme for the general problem.
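For intuition, classic Bayesian persuasion with exact best response is a small LP over obedient action recommendations; tightening each obedience constraint by a margin eps is a naive way to survive an eps-approximate best responder. A toy prosecutor/judge instance (not the paper's algorithms, which handle far more general settings):

```python
import numpy as np
from scipy.optimize import linprog

mu  = np.array([0.7, 0.3])          # prior: P(innocent)=0.7, P(guilty)=0.3
u_r = np.array([[1.0, 0.0],         # judge: +1 for acquitting the innocent,
                [0.0, 1.0]])        #        +1 for convicting the guilty
u_s = np.array([[0.0, 1.0],         # prosecutor: +1 whenever judge convicts
                [0.0, 1.0]])
eps = 0.05

# Variables x[w, a] = P(recommend action a | state w), flattened row-major.
c = -(mu[:, None] * u_s).ravel()    # linprog minimizes, so negate sender utility

A_ub, b_ub = [], []
for a in range(2):                  # obedience of recommendation a vs deviation a2
    for a2 in range(2):
        if a2 == a:
            continue
        row = np.zeros(4)
        for w in range(2):
            row[2 * w + a] = -mu[w] * (u_r[w, a] - u_r[w, a2] - eps)
        A_ub.append(row); b_ub.append(0.0)

A_eq = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], float)   # rows of x sum to 1
res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=A_eq, b_eq=[1, 1],
              bounds=[(0, 1)] * 4)
print("sender value:", -res.fun)
print("signaling scheme:\n", res.x.reshape(2, 2))
```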
Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference
·1804 words·9 mins
Machine Learning
Gaussian Processes
🏢 Columbia University
Computation-Aware Gaussian Processes (CaGP) achieve linear-time inference and model selection, enabling efficient training of GPs on large datasets without compromising uncertainty quantification.
Compressing Large Language Models using Low Rank and Low Precision Decomposition
·2393 words·12 mins
AI Generated
Natural Language Processing
Large Language Models
🏢 Stanford University
CALDERA: a new post-training LLM compression algorithm achieving state-of-the-art zero-shot performance using low-rank, low-precision decomposition.
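The skeleton of a low-rank plus low-precision decomposition W ≈ Q + LR: quantize W coarsely, then capture the residual with a truncated SVD. CALDERA additionally quantizes the factors and optimizes the decomposition, which this sketch does not attempt:

```python
import numpy as np

rng = np.random.default_rng(0)
# A matrix with decaying spectrum, standing in for a weight matrix.
W = rng.normal(size=(256, 256)) @ np.diag(1.0 / np.arange(1, 257)) @ rng.normal(size=(256, 256))

def quantize(M, bits=2):
    # Uniform symmetric quantizer with a single per-matrix scale.
    levels = 2 ** (bits - 1) - 1
    scale = np.abs(M).max() / levels
    return np.round(M / scale).clip(-levels - 1, levels) * scale

Q = quantize(W, bits=2)                       # low-precision backbone
U, s, Vt = np.linalg.svd(W - Q, full_matrices=False)
k = 32                                        # target rank for the correction
L, R = U[:, :k] * s[:k], Vt[:k]               # low-rank residual factors

for approx, name in [(Q, "Q only"), (Q + L @ R, "Q + LR")]:
    err = np.linalg.norm(W - approx) / np.linalg.norm(W)
    print(f"{name}: relative error {err:.3f}")
```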
Compositional PAC-Bayes: Generalization of GNNs with persistence and beyond
·2208 words·11 mins
AI Theory
Generalization
🏢 ETH Zurich
Novel compositional PAC-Bayes framework delivers data-dependent generalization bounds for persistence-enhanced Graph Neural Networks, improving model design and performance.