🏢 Tianjin University

Voxel Proposal Network via Multi-Frame Knowledge Distillation for Semantic Scene Completion
·2307 words·11 mins
AI Generated Computer Vision 3D Vision 🏢 Tianjin University
VPNet, a novel semantic scene completion network, uses multi-frame knowledge distillation and confident voxel proposals to improve accuracy and handle the dynamic aspects of 3D scenes reconstructed from point clouds.
Virtual Scanning: Unsupervised Non-line-of-sight Imaging from Irregularly Undersampled Transients
·2316 words·11 mins
Computer Vision Image Generation 🏢 Tianjin University
Unsupervised learning framework enables high-fidelity non-line-of-sight (NLOS) imaging from irregularly undersampled transients, surpassing state-of-the-art methods in speed and robustness.
Unlock the Intermittent Control Ability of Model Free Reinforcement Learning
·2548 words·12 mins
Machine Learning Reinforcement Learning 🏢 Tianjin University
MARS, a novel plugin framework, unlocks model-free RL’s intermittent control ability by encoding action sequences into a compact latent space, improving learning efficiency and performance on real-world robotic tasks.
Test-Time Dynamic Image Fusion
·3589 words·17 mins
AI Generated Computer Vision Image Fusion 🏢 Tianjin University
The Test-Time Dynamic Image Fusion (TTD) paradigm provably improves image fusion by dynamically weighting source data by their relative dominance, reducing generalization error without extra training.
Synergistic Dual Spatial-aware Generation of Image-to-text and Text-to-image
·2896 words·14 mins
Multimodal Learning Vision-Language Models 🏢 Tianjin University
Synergistic Dual Spatial-aware Generation boosts image-to-text and text-to-image accuracy using a novel 3D scene graph and dual learning framework.
Persistence Homology Distillation for Semi-supervised Continual Learning
·2659 words·13 mins
AI Generated Machine Learning Semi-Supervised Learning 🏢 Tianjin University
Persistence Homology Distillation (PsHD) leverages topological data analysis to robustly preserve structural information in semi-supervised continual learning, significantly outperforming existing methods.
FUG: Feature-Universal Graph Contrastive Pre-training for Graphs with Diverse Node Features
·2145 words·11 mins
Machine Learning Self-Supervised Learning 🏢 Tianjin University
FUG: A new graph contrastive pre-training strategy solves GNN transferability issues across datasets with diverse node features, achieving performance comparable to retraining while significantly improving efficiency.
Exploitation of a Latent Mechanism in Graph Contrastive Learning: Representation Scattering
·1847 words·9 mins
Self-Supervised Learning 🏢 Tianjin University
SGRL, a novel graph contrastive learning framework, significantly boosts performance by leveraging the inherent ‘representation scattering’ mechanism and integrating graph topology, outperforming existing methods.