
🏢 University of Notre Dame

Masked Hard-Attention Transformers Recognize Exactly the Star-Free Languages
·1697 words·8 mins
AI Generated · Natural Language Processing · AI Theory · 🏢 University of Notre Dame
Masked hard-attention transformers, with strict masking, precisely capture star-free languages, matching the expressive power of linear temporal logic.
Graph Diffusion Transformers for Multi-Conditional Molecular Generation
·2735 words·13 mins
AI Applications · Healthcare · 🏢 University of Notre Dame
Graph Diffusion Transformer (Graph DiT) masters multi-conditional molecular generation by integrating property representations into a graph-dependent noise model, achieving superior performance.
GFT: Graph Foundation Model with Transferable Tree Vocabulary
·4791 words·23 mins
AI Generated · Machine Learning · Transfer Learning · 🏢 University of Notre Dame
GFT: a novel graph foundation model that uses transferable computation trees as tokens, improving generalization and reducing negative transfer in graph learning.