🏢 Beijing University of Posts and Telecommunications

Rethinking the Power of Timestamps for Robust Time Series Forecasting: A Global-Local Fusion Perspective
·2114 words·10 mins
AI Generated AI Applications Finance 🏢 Beijing University of Posts and Telecommunications
GLAFF: A novel framework that significantly improves time series forecasting robustness by fusing global timestamp information with local observations, achieving a 12.5% average performance improvement.
Rethinking No-reference Image Exposure Assessment from Holism to Pixel: Models, Datasets and Benchmarks
·2343 words·11 mins
Computer Vision Image Generation 🏢 Beijing University of Posts and Telecommunications
Revolutionizing image exposure assessment, Pixel-level IEA Network (P-IEANet) achieves state-of-the-art performance with a novel pixel-level approach, a new dataset (IEA40K), and a benchmark of 19 met…
Rethinking Decoders for Transformer-based Semantic Segmentation: Compression is All You Need
·2306 words·11 mins
Computer Vision Image Segmentation 🏢 Beijing University of Posts and Telecommunications
DEPICT: A new white-box decoder for Transformer-based semantic segmentation, achieving better performance with fewer parameters by leveraging the principle of compression and connecting Transformer de…
Physics-Constrained Comprehensive Optical Neural Networks
·1493 words·8 mins
Computer Vision Image Classification 🏢 Beijing University of Posts and Telecommunications
Physics-constrained learning significantly boosts optical neural network accuracy by addressing systematic physical errors, achieving state-of-the-art results on image classification tasks.
Lumina-Next: Making Lumina-T2X Stronger and Faster with Next-DiT
·2931 words·14 mins
Multimodal Learning Multimodal Generation 🏢 Beijing University of Posts and Telecommunications
Lumina-Next supercharges image generation: faster, more efficient, and higher-resolution, with a new architecture and sampling techniques.
LT-Defense: Searching-free Backdoor Defense via Exploiting the Long-tailed Effect
·2148 words·11 mins
Natural Language Processing Large Language Models 🏢 Beijing University of Posts and Telecommunications
LT-Defense: A searching-free backdoor defense for language models that leverages the long-tailed effect of poisoned data. It achieves 98% accuracy across 1440 models at less than 1% of the time cost of existin…
Localize, Understand, Collaborate: Semantic-Aware Dragging via Intention Reasoner
·2916 words·14 mins
AI Generated Computer Vision Image Generation 🏢 Beijing University of Posts and Telecommunications
LucidDrag: Semantic-aware dragging transforms image editing with an intention reasoner and collaborative guidance, achieving superior accuracy, image fidelity, and semantic diversity.
Free-Rider and Conflict Aware Collaboration Formation for Cross-Silo Federated Learning
·2733 words·13 mins
AI Generated Machine Learning Federated Learning 🏢 Beijing University of Posts and Telecommunications
FedEgoists: A novel FL collaboration formation strategy that mitigates free riders and conflicts in cross-silo business settings, ensuring optimal coalition formation for improved model performance.
FM-Delta: Lossless Compression for Storing Massive Fine-tuned Foundation Models
·3523 words·17 mins
AI Generated Natural Language Processing Large Language Models 🏢 Beijing University of Posts and Telecommunications
FM-Delta: Lossless compression halves cloud storage for massive fine-tuned language models, saving costs without sacrificing accuracy.
Animal-Bench: Benchmarking Multimodal Video Models for Animal-centric Video Understanding
·2713 words·13 mins
Multimodal Learning Multimodal Understanding 🏢 Beijing University of Posts and Telecommunications
Animal-Bench, a new benchmark, comprehensively evaluates multimodal video models for animal-centric video understanding, featuring 13 diverse tasks across 7 animal categories and 819 species.
Adjust Pearson's $r$ to Measure Arbitrary Monotone Dependence
·1286 words·7 mins
AI Generated AI Theory Optimization 🏢 Beijing University of Posts and Telecommunications
Researchers refine Pearson’s correlation coefficient to precisely measure arbitrary monotone dependence, expanding its applicability beyond linear relationships.
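As background for this last entry: the snippet below is not the paper's adjusted coefficient (the summary above does not spell it out); it is only a minimal sketch, using NumPy and SciPy, of why the ordinary Pearson's $r$ stays well below 1 on data that are perfectly monotone but nonlinear, while a rank-based coefficient (Spearman's) does not.

```python
# Background illustration only; this is NOT the adjustment proposed in the paper.
# It shows why the classical Pearson's r understates dependence that is
# monotone but nonlinear, which is the gap the paper targets.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 5.0, size=1_000)
y = np.exp(2.0 * x)  # strictly increasing (monotone) in x, but far from linear

r_linear, _ = pearsonr(x, y)      # well below 1: Pearson's r rewards only linearity
r_monotone, _ = spearmanr(x, y)   # equals 1: the ranks of x and y align perfectly

print(f"Pearson's r:    {r_linear:.3f}")
print(f"Spearman's rho: {r_monotone:.3f}")
```

Per the summary above, the paper instead adjusts Pearson's $r$ itself so that arbitrary monotone dependence is measured precisely.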