Text Generation

YouDream: Generating Anatomically Controllable Consistent Text-to-3D Animals
·2987 words·15 mins
Natural Language Processing Text Generation 🏢 University of Texas at Austin
YouDream generates anatomically consistent, high-quality 3D animal models from text and 2D pose priors, pushing creative boundaries in text-to-3D generation.
Token Merging for Training-Free Semantic Binding in Text-to-Image Synthesis
·2366 words·12 mins
Natural Language Processing Text Generation 🏢 Nankai University
ToMe: a novel training-free method that dramatically improves semantic binding in text-to-image synthesis by intelligently merging related tokens, ensuring accurate alignment between generated images and text prompts.
Superposed Decoding: Multiple Generations from a Single Autoregressive Inference Pass
·2903 words·14 mins
Natural Language Processing Text Generation 🏢 University of Washington
Generate multiple text drafts from a single language model pass with Superposed Decoding, significantly boosting efficiency!
SpeechAlign: Aligning Speech Generation to Human Preferences
·1822 words·9 mins
Natural Language Processing Text Generation 🏢 Fudan University
SpeechAlign: Iteratively aligning speech generation models to human preferences via preference optimization, bridging distribution gaps for improved speech quality.
Simplified and Generalized Masked Diffusion for Discrete Data
·2082 words·10 mins
Natural Language Processing Text Generation 🏢 Google DeepMind
Simplified and generalized masked diffusion models achieve state-of-the-art results in discrete data generation, surpassing previous methods in text and image modeling.
Probing Social Bias in Labor Market Text Generation by ChatGPT: A Masked Language Model Approach
·3286 words·16 mins
AI Generated Natural Language Processing Text Generation 🏢 Department of Mathematical and Statistical Sciences, University of Alberta, Canada
ChatGPT amplifies gender bias in job applications, revealing AI’s potential to worsen labor market inequality.
Meta-DiffuB: A Contextualized Sequence-to-Sequence Text Diffusion Model with Meta-Exploration
·3164 words·15 mins
AI Generated Natural Language Processing Text Generation 🏢 University of Washington
Meta-DiffuB enhances sequence-to-sequence text diffusion models by using meta-exploration to learn a contextualized noise schedule, resulting in state-of-the-art performance.
Local to Global: Learning Dynamics and Effect of Initialization for Transformers
·2433 words·12 mins
AI Generated Natural Language Processing Text Generation 🏢 EPFL
Transformers’ learning dynamics depend heavily on initialization and Markovian data properties, converging to either global or local minima; this paper proves this behavior, offers initialization guidelines, and …
Learning and Transferring Sparse Contextual Bigrams with Linear Transformers
·1445 words·7 mins
Natural Language Processing Text Generation 🏢 Princeton University
Linear transformers efficiently learn sparse contextual bigrams by leveraging both in-context and global information, achieving polynomial sample complexity.
Learnability Matters: Active Learning for Video Captioning
·2406 words·12 mins
Natural Language Processing Text Generation 🏢 Hangzhou Dianzi University
Active learning for video captioning is enhanced by a novel algorithm that prioritizes ‘learnability’, diversity, and uncertainty to address annotation inconsistency.
LCGen: Mining in Low-Certainty Generation for View-consistent Text-to-3D
·2307 words·11 mins
Natural Language Processing Text Generation 🏢 Shanghai Engineering Research Center of AI & Robotics, Academy for Engineering & Technology, Fudan University
LCGen: A novel method for view-consistent text-to-3D generation, resolving the ‘Janus Problem’ by strategically using low-certainty priors to align viewpoints and optimize the generation process.
Incentivizing Quality Text Generation via Statistical Contracts
·1392 words·7 mins
Natural Language Processing Text Generation 🏢 Technion - Israel Institute of Technology
Cost-robust contracts, inspired by statistical hypothesis tests, incentivize quality in LLM text generation, overcoming the moral hazard of pay-per-token models.
IF-Font: Ideographic Description Sequence-Following Font Generation
·3165 words·15 mins
AI Generated Natural Language Processing Text Generation 🏢 Fuzhou University
IF-Font: Revolutionary font generation using Ideographic Description Sequences (IDS) to surpass state-of-the-art methods in style transfer, especially for unique styles.
Fast Sampling via Discrete Non-Markov Diffusion Models with Predetermined Transition Time
·2326 words·11 mins
Natural Language Processing Text Generation 🏢 University of California, Los Angeles
Accelerated discrete diffusion model sampling is achieved via novel discrete non-Markov diffusion models (DNDM) with predetermined transition times, enabling a training-free algorithm that significantly speeds up sampling.
Discrete Modeling via Boundary Conditional Diffusion Processes
·2908 words·14 mins
AI Generated Natural Language Processing Text Generation 🏢 Harbin Institute of Technology
Bridging the gap between continuous diffusion models and discrete data, this work introduces a novel boundary-conditional approach achieving superior performance in language modeling and image generation.
ContextCite: Attributing Model Generation to Context
·7666 words·36 mins
AI Generated Natural Language Processing Text Generation 🏢 MIT
ContextCite pinpoints which parts of a given context led a language model to generate a specific statement, improving model verification and response quality.
AP-Adapter: Improving Generalization of Automatic Prompts on Unseen Text-to-Image Diffusion Models
·2738 words·13 mins
Natural Language Processing Text Generation 🏢 State Key Laboratory for Novel Software Technology, Nanjing University
AP-Adapter boosts text-to-image diffusion model generalization by using a two-stage prompt optimization method that leverages large language models and inter-model differences.
AdaNovo: Towards Robust De Novo Peptide Sequencing in Proteomics against Data Biases
·1791 words·9 mins
Natural Language Processing Text Generation 🏢 Westlake University
AdaNovo tackles data biases in de novo peptide sequencing by using Conditional Mutual Information, significantly improving PTM identification and overall accuracy.