🏢 Department of Computer Science, City University of Hong Kong

Provably Transformers Harness Multi-Concept Word Semantics for Efficient In-Context Learning
·1305 words·7 mins
Natural Language Processing Large Language Models 🏢 Department of Computer Science, City University of Hong Kong
Transformers excel at in-context learning (ICL), solving new tasks from prompts alone. This paper provides a mathematical explanation, showing how transformers harness multi-concept word semantics to achieve efficient in-context learning.
Flatten Anything: Unsupervised Neural Surface Parameterization
·2390 words·12 mins
Computer Vision 3D Vision 🏢 Department of Computer Science, City University of Hong Kong
Flatten Anything Model (FAM) revolutionizes neural surface parameterization with unsupervised learning, automatically handling complex topologies and unstructured data.