
🏢 Chinese University of Hong Kong, Shenzhen

Graph Classification via Reference Distribution Learning: Theory and Practice
·2262 words·11 mins
Machine Learning Deep Learning 🏢 Chinese University of Hong Kong, Shenzhen
GRDL: a novel graph classification method boasting 10x speed improvement over competitors, achieved by treating node embeddings as distributions and avoiding global pooling.
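The core idea named in the teaser, treating a graph's node embeddings as a distribution and comparing it to learnable reference distributions instead of pooling, can be sketched as follows. This is a minimal illustration, not the GRDL implementation: the class `ReferenceDistributionClassifier`, the RBF-kernel squared MMD used as the discrepancy, and all hyperparameters are assumptions for the sake of the example.

```python
# Minimal sketch (not the paper's code): classify a graph by comparing the
# empirical distribution of its node embeddings to learnable per-class
# reference point sets, using a squared MMD with an RBF kernel as a stand-in
# discrepancy. No global pooling of node embeddings is performed.
import torch
import torch.nn as nn


def rbf_mmd2(x, y, sigma=1.0):
    """Squared MMD between point sets x (n, d) and y (m, d) with an RBF kernel."""
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)
        return torch.exp(-d2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()


class ReferenceDistributionClassifier(nn.Module):
    """Holds one learnable reference point set per class (illustrative)."""

    def __init__(self, num_classes, ref_size, dim):
        super().__init__()
        self.refs = nn.Parameter(torch.randn(num_classes, ref_size, dim))

    def forward(self, node_embeddings):
        # node_embeddings: (num_nodes, dim) produced by any GNN encoder.
        # Score each class by the negated discrepancy to its reference set.
        return torch.stack([-rbf_mmd2(node_embeddings, ref) for ref in self.refs])


if __name__ == "__main__":
    torch.manual_seed(0)
    clf = ReferenceDistributionClassifier(num_classes=3, ref_size=8, dim=16)
    h = torch.randn(30, 16)          # node embeddings of one graph
    print(clf(h).argmax().item())    # predicted class index
```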
Disentangling Linear Quadratic Control with Untrusted ML Predictions
·1894 words·9 mins
AI Applications Robotics 🏢 Chinese University of Hong Kong, Shenzhen
DISC, a novel control policy, disentangles untrusted ML predictions to achieve near-optimal performance when predictions are accurate, while guaranteeing competitive ratio bounds even with significant prediction errors.
Boosting Graph Pooling with Persistent Homology
·2499 words·12 mins
Machine Learning Deep Learning 🏢 Chinese University of Hong Kong, Shenzhen
Boosting graph neural networks: Topology-Invariant Pooling (TIP) leverages persistent homology to enhance graph pooling, achieving consistent performance gains across diverse datasets.
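The teaser mentions persistent homology as the topological signal behind the pooling. The sketch below only illustrates the persistent-homology ingredient, computing 0-dimensional persistence pairs of a weighted graph with a Kruskal-style union-find; it is not the TIP pooling layer itself, and the function name and example edge weights are assumptions.

```python
# Minimal sketch (not the TIP implementation): extract 0-dimensional persistence
# pairs from a weighted graph via a union-find over the edge filtration. Such
# topological summaries could then be fed into a pooling layer; here we only
# compute the finite persistence values themselves.
def zero_dim_persistence(num_nodes, weighted_edges):
    """weighted_edges: list of (u, v, weight); nodes are born at filtration 0.

    Returns the finite death times of 0-dim features (component merges)."""
    parent = list(range(num_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    deaths = []
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:                 # this edge merges two components, so a
            parent[ru] = rv          # 0-dim feature dies at filtration value w
            deaths.append(w)
    return deaths                    # persistence = death - 0 for each pair


if __name__ == "__main__":
    edges = [(0, 1, 0.2), (1, 2, 0.5), (0, 2, 0.9), (2, 3, 0.7)]
    print(zero_dim_persistence(4, edges))  # [0.2, 0.5, 0.7]
```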
BAdam: A Memory Efficient Full Parameter Optimization Method for Large Language Models
·2359 words·12 mins
Natural Language Processing Large Language Models 🏢 Chinese University of Hong Kong, Shenzhen
BAdam: A memory-efficient optimization method enabling full parameter fine-tuning of large language models using a block coordinate descent framework with Adam’s update rule, achieving comparable or superior performance.
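The memory saving comes from the block coordinate descent structure: only one parameter block is trainable at a time, so Adam state exists only for that block. Below is a minimal sketch of this scheme under assumed details (a toy model, one block per leaf module, a fixed number of inner Adam steps); it is not the BAdam codebase.

```python
# Minimal sketch (not the BAdam codebase): block coordinate descent where only
# the active parameter block is trainable and is updated with Adam, so
# optimizer state is only kept for that block at any time.
from itertools import cycle, islice

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
loss_fn = nn.CrossEntropyLoss()

# Illustrative block partition: one block per leaf module that has parameters.
blocks = [list(m.parameters()) for m in model if any(True for _ in m.parameters())]


def run_block_epoch(data_loader, inner_steps=10, lr=1e-3):
    for block in blocks:                       # cycle through the blocks
        for p in model.parameters():
            p.requires_grad_(False)            # freeze everything ...
        for p in block:
            p.requires_grad_(True)             # ... except the active block
        opt = torch.optim.Adam(block, lr=lr)   # Adam state for this block only
        for x, y in islice(cycle(data_loader), inner_steps):
            opt.zero_grad()                    # a few Adam steps on the block
            loss_fn(model(x), y).backward()
            opt.step()


if __name__ == "__main__":
    xs, ys = torch.randn(256, 32), torch.randint(0, 2, (256,))
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(xs, ys), batch_size=32, shuffle=True)
    run_block_epoch(loader)
```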
B-ary Tree Push-Pull Method is Provably Efficient for Distributed Learning on Heterogeneous Data
·1511 words·8 mins
Machine Learning Deep Learning 🏢 Chinese University of Hong Kong, Shenzhen
B-ary Tree Push-Pull (BTPP) achieves linear speedup for distributed learning on heterogeneous data, significantly outperforming state-of-the-art methods with minimal communication.
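To make the "push-pull over a b-ary tree" communication pattern concrete, here is a toy single-process simulation: gradients are pushed up the tree and averaged, and the updated parameters are (implicitly) pulled back down. This is a simplified stand-in, not BTPP's actual push-pull gradient-tracking update; the quadratic local objectives, tree builder, and step size are assumptions.

```python
# Toy, single-process illustration (not the paper's algorithm): workers sit on a
# b-ary tree; each round, local gradients are "pushed" up to the root where they
# are averaged, and the updated parameters are "pulled" back down the tree.
import numpy as np


def bary_tree(num_workers, b):
    """Children lists of a b-ary tree over nodes 0..num_workers-1."""
    return {i: [c for c in range(b * i + 1, b * i + b + 1) if c < num_workers]
            for i in range(num_workers)}


def push_pull_round(x, local_grads, children, root=0, lr=0.1):
    def push(node):                    # push: aggregate gradients bottom-up
        g = local_grads[node].copy()
        for c in children[node]:
            g += push(c)
        return g

    avg_grad = push(root) / len(local_grads)
    return x - lr * avg_grad           # pull: x is broadcast down (implicit here)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, b, d = 7, 2, 3                  # 7 workers on a binary (b = 2) tree
    children = bary_tree(n, b)
    x = np.zeros(d)
    targets = rng.normal(size=(n, d))  # heterogeneous local objectives 0.5*||x - t_i||^2
    for _ in range(200):
        grads = [x - t for t in targets]
        x = push_pull_round(x, grads, children, lr=0.2)
    print(np.allclose(x, targets.mean(axis=0), atol=1e-3))  # converges to the consensus optimum
```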