Search Results for author: Jie Lu

Found 44 papers, 19 papers with code

Group-Aware Coordination Graph for Multi-Agent Reinforcement Learning

1 code implementation17 Apr 2024 Wei Duan, Jie Lu, Junyu Xuan

To overcome these limitations, we present a novel approach to infer the Group-Aware Coordination Graph (GACG), which is designed to capture both the cooperation between agent pairs based on current observations and group-level dependencies from behaviour patterns observed across trajectories.

Decision Making Multi-agent Reinforcement Learning +3
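
For intuition, the sketch below shows one generic way to blend pairwise observation-based cooperation scores with trajectory-level group statistics into a single adjacency matrix. It is an illustration only, not the authors' GACG construction; all function and variable names are hypothetical.

```python
import numpy as np

def toy_coordination_graph(obs, trajectory, alpha=0.5):
    """Illustrative only: blend pairwise observation similarity with
    group-level co-behaviour statistics into one adjacency matrix."""
    # Pairwise cooperation from current observations (cosine similarity).
    norm = obs / (np.linalg.norm(obs, axis=1, keepdims=True) + 1e-8)
    pairwise = norm @ norm.T
    # Group-level dependencies from behaviour patterns across the trajectory,
    # here approximated by correlating each agent's behaviour sequence.
    group = np.corrcoef(trajectory)
    adj = alpha * pairwise + (1 - alpha) * group
    np.fill_diagonal(adj, 0.0)
    return adj

obs = np.random.rand(4, 8)      # 4 agents, 8-dim observations
traj = np.random.rand(4, 50)    # 4 agents, 50 time steps of behaviour
print(toy_coordination_graph(obs, traj).round(2))
```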

On the Learnability of Out-of-distribution Detection

no code implementations7 Apr 2024 Zhen Fang, Yixuan Li, Feng Liu, Bo Han, Jie Lu

Based on this observation, we next give several necessary and sufficient conditions to characterize the learnability of OOD detection in some practical scenarios.

Learning Theory Out-of-Distribution Detection +2

Transformer-Lite: High-efficiency Deployment of Large Language Models on Mobile Phone GPUs

no code implementations29 Mar 2024 Luchang Li, Sheng Qian, Jie Lu, Lunxi Yuan, Rui Wang, Qin Xie

The Large Language Model (LLM) is widely employed for tasks such as intelligent assistants, text summarization, translation, and multi-modality on mobile phones.

Language Modelling Large Language Model +2

Inferring Latent Temporal Sparse Coordination Graph for Multi-Agent Reinforcement Learning

1 code implementation28 Mar 2024 Wei Duan, Jie Lu, Junyu Xuan

LTS-CG leverages agents' historical observations to calculate an agent-pair probability matrix, from which a sparse graph is sampled and used for knowledge exchange between agents, thereby simultaneously capturing agent dependencies and relation uncertainty.

Graph Learning Multi-agent Reinforcement Learning +2
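
As a rough illustration of the sampling step described above (hypothetical names, not the LTS-CG implementation), an agent-pair probability matrix can be turned into a sparse graph by Bernoulli-sampling each edge:

```python
import numpy as np

def sample_sparse_graph(prob_matrix, rng=None):
    """Bernoulli-sample an undirected sparse graph from an
    agent-pair probability matrix (illustrative sketch only)."""
    rng = rng or np.random.default_rng(0)
    n = prob_matrix.shape[0]
    upper = np.triu(rng.random((n, n)) < prob_matrix, k=1)  # sample upper triangle
    adj = upper | upper.T                                    # symmetrise
    return adj.astype(int)

probs = np.full((5, 5), 0.3)   # toy agent-pair probabilities
print(sample_sparse_graph(probs))
```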

Layer-diverse Negative Sampling for Graph Neural Networks

no code implementations18 Mar 2024 Wei Duan, Jie Lu, Yu Guang Wang, Junyu Xuan

Experiments on various real-world graph datasets demonstrate the effectiveness of our approach in improving the diversity of negative samples and overall learning performance.

Online Boosting Adaptive Learning under Concept Drift for Multistream Classification

no code implementations17 Dec 2023 En Yu, Jie Lu, Bin Zhang, Guangquan Zhang

Specifically, OBAL operates in a dual-phase mechanism. In the first phase, we design an Adaptive COvariate Shift Adaptation (AdaCOSA) algorithm to construct an initialized ensemble model using archived data from various source streams, mitigating covariate shift while learning the dynamic correlations via an adaptive re-weighting strategy.
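
The covariate-shift re-weighting idea can be sketched with a standard density-ratio trick (a logistic-regression domain discriminator). This is a generic illustration of covariate-shift correction, not the AdaCOSA algorithm itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def covariate_shift_weights(X_source, X_target):
    """Estimate importance weights p_target(x)/p_source(x) for source
    instances via a domain classifier (generic sketch, not AdaCOSA)."""
    X = np.vstack([X_source, X_target])
    d = np.r_[np.zeros(len(X_source)), np.ones(len(X_target))]  # 0 = source, 1 = target
    clf = LogisticRegression(max_iter=1000).fit(X, d)
    p = clf.predict_proba(X_source)[:, 1]
    return p / np.clip(1 - p, 1e-6, None)   # odds ratio approximates the density ratio

Xs = np.random.randn(200, 5)
Xt = np.random.randn(150, 5) + 0.5
print(covariate_shift_weights(Xs, Xt)[:5].round(3))
```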

Meta OOD Learning for Continuously Adaptive OOD Detection

no code implementations ICCV 2023 Xinheng Wu, Jie Lu, Zhen Fang, Guangquan Zhang

To address CAOOD, we develop meta OOD learning (MOL) by designing a learning-to-adapt diagram such that a good initialized OOD detection model is learned during the training process.

Out of Distribution (OOD) Detection

Graph Convolutional Neural Networks with Diverse Negative Samples via Decomposed Determinant Point Processes

1 code implementation5 Dec 2022 Wei Duan, Junyu Xuan, Maoying Qiao, Jie Lu

However, the whole graph contains many more non-neighbour nodes, which provide diverse and useful information for the representation update.

Computational Efficiency Graph Representation Learning +2

Is Out-of-Distribution Detection Learnable?

no code implementations26 Oct 2022 Zhen Fang, Yixuan Li, Jie Lu, Jiahua Dong, Bo Han, Feng Liu

Based on this observation, we next give several necessary and sufficient conditions to characterize the learnability of OOD detection in some practical scenarios.

Learning Theory Out-of-Distribution Detection +2

Learning from the Dark: Boosting Graph Convolutional Neural Networks with Diverse Negative Samples

1 code implementation3 Oct 2022 Wei Duan, Junyu Xuan, Maoying Qiao, Jie Lu

An interesting way to understand GCNs is to think of them as a message passing mechanism where each node updates its representation by accepting information from its neighbours (also known as positive samples).

Representation Learning
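
To make the message-passing view concrete: neighbour (positive) messages pull a node's representation together, while sampled non-neighbour (negative) messages can push it away. The toy layer below illustrates that idea only; it is not the proposed DPP-based sampling method.

```python
import numpy as np

def gcn_layer_with_negatives(H, adj, neg_adj, W, beta=0.1):
    """One simplified propagation step: aggregate neighbours (positive
    samples) and subtract a scaled aggregate of negative samples."""
    pos_msg = (adj @ H) / (adj.sum(1, keepdims=True) + 1e-8)        # mean over neighbours
    neg_msg = (neg_adj @ H) / (neg_adj.sum(1, keepdims=True) + 1e-8)  # mean over negatives
    return np.maximum((pos_msg - beta * neg_msg) @ W, 0.0)          # ReLU

n, d = 6, 4
H = np.random.rand(n, d)
adj = (np.random.rand(n, n) > 0.6).astype(float); np.fill_diagonal(adj, 0)
neg_adj = 1 - adj; np.fill_diagonal(neg_adj, 0)
W = np.random.rand(d, d)
print(gcn_layer_with_negatives(H, adj, neg_adj, W).shape)
```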

Multi-class Classification with Fuzzy-feature Observations: Theory and Algorithms

1 code implementation9 Jun 2022 Guangzhi Ma, Jie Lu, Feng Liu, Zhen Fang, Guangquan Zhang

Hence, in this paper, we propose a novel framework to address a new realistic problem called multi-class classification with imprecise observations (MCIMO), where we need to train a classifier with fuzzy-feature observations.

Classification Multi-class Classification

Bayesian Transfer Learning: An Overview of Probabilistic Graphical Models for Transfer Learning

no code implementations27 Sep 2021 Junyu Xuan, Jie Lu, Guangquan Zhang

Transfer learning, in which transferable knowledge is extracted from the source domain(s) and reused in the target domain, has become a research area of great interest in the field of artificial intelligence.

Transfer Learning

Learning Bounds for Open-Set Learning

1 code implementation30 Jun 2021 Zhen Fang, Jie Lu, Anjin Liu, Feng Liu, Guangquan Zhang

In this paper, we target a more challenging and realistic setting: open-set learning (OSL), where there exist test samples from the classes that are unseen during training.

Learning Theory Open Set Learning +1

Meta Two-Sample Testing: Learning Kernels for Testing with Limited Data

1 code implementation NeurIPS 2021 Feng Liu, Wenkai Xu, Jie Lu, Danica J. Sutherland

In realistic scenarios with very limited numbers of data samples, however, it can be challenging to identify a kernel powerful enough to distinguish complex distributions.

Two-sample testing Vocal Bursts Valence Prediction
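
For context, the kind of statistic whose kernel is being learned here is the maximum mean discrepancy (MMD). Below is a plain MMD estimate with a fixed Gaussian kernel; the paper's contribution is meta-learning the kernel, which this sketch does not do.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of squared MMD with a fixed Gaussian kernel."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

X = np.random.randn(100, 2)
Y = np.random.randn(100, 2) + 0.3
print(round(mmd2(X, Y), 4))
```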

Automatic Learning to Detect Concept Drift

no code implementations4 May 2021 Hang Yu, Tianyu Liu, Jie Lu, Guangquan Zhang

Many methods have been proposed to detect concept drift, i.e., a change in the distribution of streaming data, because concept drift causes a decrease in the prediction accuracy of algorithms.

Active Learning Meta-Learning
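
A minimal drift check along these lines (not the paper's meta-learned detector) compares a reference window of the stream with the most recent window, for example with a two-sample Kolmogorov-Smirnov test:

```python
import numpy as np
from scipy.stats import ks_2samp

def drifted(reference, recent, alpha=0.01):
    """Flag concept drift when the recent window's distribution differs
    significantly from the reference window (univariate KS test)."""
    stat, p_value = ks_2samp(reference, recent)
    return p_value < alpha

stream = np.concatenate([np.random.normal(0, 1, 500),
                         np.random.normal(1.5, 1, 500)])   # drift at t=500
print(drifted(stream[:500], stream[-200:]))                # expected: True
```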

Decentralized Statistical Inference with Unrolled Graph Neural Networks

1 code implementation4 Apr 2021 He Wang, Yifei Shen, Ziyuan Wang, Dongsheng Li, Jun Zhang, Khaled B. Letaief, Jie Lu

In this paper, we investigate the decentralized statistical inference problem, where a network of agents cooperatively recover a (structured) vector from private noisy samples without centralized coordination.
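
A bare-bones decentralized estimator of this kind is plain decentralized gradient descent: each agent takes a local least-squares gradient step and then averages with its neighbours via a mixing matrix. This sketch illustrates the problem setting, not the unrolled-GNN method of the paper.

```python
import numpy as np

def decentralized_least_squares(A_list, y_list, mixing, steps=200, lr=0.01):
    """Each agent i holds (A_i, y_i) and recovers a shared vector x by
    local gradient steps plus neighbour averaging (consensus)."""
    n_agents, dim = len(A_list), A_list[0].shape[1]
    X = np.zeros((n_agents, dim))
    for _ in range(steps):
        grads = np.stack([A.T @ (A @ x - y) for A, x, y in zip(A_list, X, y_list)])
        X = mixing @ X - lr * grads          # consensus step + local gradient step
    return X

rng = np.random.default_rng(0)
x_true = rng.standard_normal(3)
A_list = [rng.standard_normal((20, 3)) for _ in range(4)]
y_list = [A @ x_true + 0.01 * rng.standard_normal(20) for A in A_list]
ring = np.array([[.5, .25, 0, .25], [.25, .5, .25, 0],
                 [0, .25, .5, .25], [.25, 0, .25, .5]])   # doubly stochastic ring
print(decentralized_least_squares(A_list, y_list, ring)[0].round(2), x_true.round(2))
```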

Distributed Optimization with Coupling Constraints

no code implementations25 Feb 2021 Xuyang Wu, He Wang, Jie Lu

In this paper, we develop a novel distributed algorithm for addressing convex optimization with both nonlinear inequality and linear equality constraints, where the objective function can be a general nonsmooth convex function and all the constraints can be fully coupled.

Distributed Optimization Optimization and Control
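
To illustrate the problem class only (this is a centralized toy, not the paper's distributed algorithm), a basic primal-descent / dual-ascent loop handles a convex objective with a coupling equality constraint across blocks:

```python
import numpy as np

def primal_dual_coupled(c_list, b, steps=500, eta=0.1, eta_d=0.1):
    """Solve  min_x  sum_i 0.5*||x_i - c_i||^2  s.t.  sum_i x_i = b
    with a simple primal-descent / dual-ascent iteration (illustrative only)."""
    X = [np.zeros_like(c) for c in c_list]
    lam = np.zeros_like(b)                    # multiplier for the coupling constraint
    for _ in range(steps):
        X = [x - eta * ((x - c) + lam) for x, c in zip(X, c_list)]
        lam = lam + eta_d * (sum(X) - b)
    return X

c_list = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([1.0, 1.0])]
b = np.array([1.0, 1.0])
X = primal_dual_coupled(c_list, b)
print(sum(X).round(3))     # should be close to b
```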

PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior

1 code implementation7 Feb 2021 Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang

By leveraging experience from previous tasks, meta-learning algorithms can achieve effective fast adaptation ability when encountering new tasks.

Meta-Learning

How does the Combined Risk Affect the Performance of Unsupervised Domain Adaptation Approaches?

no code implementations30 Dec 2020 Li Zhong, Zhen Fang, Feng Liu, Jie Lu, Bo Yuan, Guangquan Zhang

Experiments show that the proxy can effectively curb the increase of the combined risk when minimizing the source risk and distribution discrepancy.

Unsupervised Domain Adaptation

Interactive Steering of Hierarchical Clustering

no code implementations21 Sep 2020 Weikai Yang, Xiting Wang, Jie Lu, Wenwen Dou, Shixia Liu

The novelty of our approach includes 1) automatically constructing constraints for hierarchical clustering using knowledge (knowledge-driven) and intrinsic data distribution (data-driven), and 2) enabling the interactive steering of clustering through a visual interface (user-driven).

Clustering

Concept Drift Detection: Dealing with Missing Values via Fuzzy Distance Estimations

1 code implementation9 Aug 2020 Anjin Liu, Jie Lu, Guangquan Zhang

Our solution comprises a novel masked distance learning (MDL) algorithm to reduce the cumulative errors caused by iteratively estimating each missing value in an observation and a fuzzy-weighted frequency (FWF) method for identifying discrepancies in the data distribution.

Imputation
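
The flavour of a masked distance can be shown in a few lines: compute the distance only over dimensions observed in both instances and rescale for the missing ones. This is a generic fixed rule for illustration; the paper's MDL algorithm is learned, not this formula.

```python
import numpy as np

def masked_euclidean(a, b):
    """Euclidean distance over dimensions where both instances are
    observed, rescaled to the full dimensionality."""
    mask = ~(np.isnan(a) | np.isnan(b))
    if mask.sum() == 0:
        return np.nan
    d2 = ((a[mask] - b[mask]) ** 2).sum()
    return np.sqrt(d2 * a.size / mask.sum())   # rescale for missing dimensions

x = np.array([1.0, np.nan, 3.0, 4.0])
y = np.array([2.0, 5.0, np.nan, 4.0])
print(round(masked_euclidean(x, y), 3))        # uses dimensions 0 and 3 only
```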

Learning from a Complementary-label Source Domain: Theory and Algorithms

1 code implementation4 Aug 2020 Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu

We consider two cases of this setting, one is that the source domain only contains complementary-label data (completely complementary unsupervised domain adaptation, CC-UDA), and the other is that the source domain has plenty of complementary-label data and a small amount of true-label data (partly complementary unsupervised domain adaptation, PC-UDA).

Unsupervised Domain Adaptation

Clarinet: A One-step Approach Towards Budget-friendly Unsupervised Domain Adaptation

1 code implementation29 Jul 2020 Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu

To mitigate this problem, we consider a novel problem setting where the classifier for the target domain has to be trained with complementary-label data from the source domain and unlabeled data from the target domain named budget-friendly UDA (BFUDA).

Unsupervised Domain Adaptation

Bridging the Theoretical Bound and Deep Algorithms for Open Set Domain Adaptation

no code implementations23 Jun 2020 Li Zhong, Zhen Fang, Feng Liu, Bo Yuan, Guangquan Zhang, Jie Lu

To achieve this aim, a previous study has proven an upper bound of the target-domain risk, and the open set difference, as an important term in the upper bound, is used to measure the risk on unknown target data.

Domain Adaptation Object Recognition

Learning under Concept Drift: A Review

no code implementations13 Apr 2020 Jie Lu, Anjin Liu, Fan Dong, Feng Gu, Joao Gama, Guangquan Zhang

To help researchers identify which research topics are significant and how to apply related techniques in data analysis tasks, a high-quality, instructive review of current research developments and trends in the concept drift field is necessary.

Diverse Instances-Weighting Ensemble based on Region Drift Disagreement for Concept Drift Adaptation

no code implementations13 Apr 2020 Anjin Liu, Jie Lu, Guangquan Zhang

Concept drift refers to changes in the distribution of underlying data and is an inherent property of evolving data streams.

Ensemble Learning

ATL: Autonomous Knowledge Transfer from Many Streaming Processes

2 code implementations8 Oct 2019 Mahardhika Pratama, Marcus de Carvalho, Renchunzi Xie, Edwin Lughofer, Jie Lu

It automatically evolves its network structure from scratch, with or without ground truth, to overcome independent concept drifts in the source and target domains.

Online Domain Adaptation Transfer Learning

Wildly Unsupervised Domain Adaptation and Its Powerful and Efficient Solution

no code implementations25 Sep 2019 Feng Liu, Jie Lu, Bo Han, Gang Niu, Guangquan Zhang, Masashi Sugiyama

Hence, we consider a new, more realistic and more challenging problem setting, where classifiers have to be trained with noisy labeled data from SD and unlabeled data from TD---we name it wildly UDA (WUDA).

Unsupervised Domain Adaptation Wildly Unsupervised Domain Adaptation

Cross-domain Network Representations

no code implementations1 Aug 2019 Shan Xue, Jie Lu, Guangquan Zhang

By generating random walks from a structurally rich domain and transferring the knowledge carried by those walks across domains, it enables a network representation for the structurally scarce domain as well.

Transfer Learning
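
The random-walk generation referred to above can be illustrated with a generic uniform walker over an adjacency list (a hypothetical helper, not the paper's cross-domain transfer procedure):

```python
import random

def random_walks(adjacency, walk_length=5, walks_per_node=2, seed=0):
    """Generate simple uniform random walks from every node of a graph
    given as an adjacency dict (illustrative sketch only)."""
    rng = random.Random(seed)
    walks = []
    for start in adjacency:
        for _ in range(walks_per_node):
            walk, node = [start], start
            for _ in range(walk_length - 1):
                neighbours = adjacency[node]
                if not neighbours:
                    break
                node = rng.choice(neighbours)
                walk.append(node)
            walks.append(walk)
    return walks

graph = {"a": ["b", "c"], "b": ["a"], "c": ["a", "b"], "d": []}
print(random_walks(graph)[:3])
```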

Open Set Domain Adaptation: Theoretical Bound and Algorithm

1 code implementation19 Jul 2019 Zhen Fang, Jie Lu, Feng Liu, Junyu Xuan, Guangquan Zhang

The aim of unsupervised domain adaptation is to leverage the knowledge in a labeled (source) domain to improve a model's learning performance with an unlabeled (target) domain -- the basic strategy being to mitigate the effects of discrepancies between the two distributions.

Unsupervised Domain Adaptation

Butterfly: One-step Approach towards Wildly Unsupervised Domain Adaptation

1 code implementation19 May 2019 Feng Liu, Jie Lu, Bo Han, Gang Niu, Guangquan Zhang, Masashi Sugiyama

Hence, we consider a new, more realistic and more challenging problem setting, where classifiers have to be trained with noisy labeled data from SD and unlabeled data from TD -- we name it wildly UDA (WUDA).

Unsupervised Domain Adaptation Wildly Unsupervised Domain Adaptation

A Choquet Fuzzy Integral Vertical Bagging Classifier for Mobile Telematics Data Analysis

no code implementations19 Mar 2019 Mohammad Siami, Mohsen Naderpour, Jie Lu

The application of mobile telematics has been explored in many areas, such as insurance and road safety.

Deep Uncertainty Quantification: A Machine Learning Approach for Weather Forecasting

3 code implementations22 Dec 2018 Bin Wang, Jie Lu, Zheng Yan, Huaishao Luo, Tianrui Li, Yu Zheng, Guangquan Zhang

We cast the weather forecasting problem as an end-to-end deep learning problem and solve it by proposing a novel negative log-likelihood error (NLE) loss function.

BIG-bench Machine Learning Uncertainty Quantification +1
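
The abstract mentions a negative log-likelihood error (NLE) loss. A generic heteroscedastic Gaussian negative log-likelihood from the same family is sketched below; this is the standard formulation, not necessarily the exact NLE used in the paper.

```python
import numpy as np

def gaussian_nll(y_true, mu, log_var):
    """Heteroscedastic Gaussian negative log-likelihood: the network predicts
    both a mean and a log-variance, so large predicted uncertainty discounts
    the squared error but adds a log-variance penalty."""
    var = np.exp(log_var)
    return float(np.mean(0.5 * (log_var + (y_true - mu) ** 2 / var)))

y = np.array([21.0, 19.5, 25.0])            # e.g. observed temperatures
mu = np.array([20.0, 20.0, 24.0])           # predicted means
log_var = np.array([0.0, 0.5, -0.5])        # predicted log-variances
print(round(gaussian_nll(y, mu, log_var), 4))
```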

Deep Surface Light Fields

no code implementations15 Oct 2018 Anpei Chen, Minye Wu, Yingliang Zhang, Nianyi Li, Jie Lu, Shenghua Gao, Jingyi Yu

A surface light field represents the radiance of rays originating from any point on the surface in any direction.

Data Compression Image Registration

Cooperative Hierarchical Dirichlet Processes: Superposition vs. Maximization

no code implementations18 Jul 2017 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu

The cooperative hierarchical structure is a common and significant data structure observed in, or adopted by, many research areas, such as: text mining (author-paper-word) and multi-label classification (label-instance-feature).

Multi-Label Classification Topic Models

Scalable Inference for Nested Chinese Restaurant Process Topic Models

no code implementations23 Feb 2017 Jianfei Chen, Jun Zhu, Jie Lu, Shixia Liu

Finally, we propose an efficient distributed implementation of PCGS through vectorization, pre-processing, and a careful design of the concurrent data structures and communication strategy.

Topic Models Variational Inference

Heterogeneous domain adaptation: An unsupervised approach

no code implementations10 Jan 2017 Feng Liu, Guangquan Zhang, Jie Lu

To contribute to the research in this emerging field, this paper presents: (1) an unsupervised knowledge transfer theorem that guarantees the correctness of transferring knowledge; and (2) a principal angle-based metric to measure the distance between two pairs of domains: one pair comprises the original source and target domains and the other pair comprises two homogeneous representations of two domains.

Text Classification +2
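
The principal angle-based distance mentioned above can be computed, for two subspaces given by basis-spanning matrices, from the singular values of the product of their orthonormal bases. The sketch below shows that standard computation only, not the paper's full metric between domain pairs.

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles between the column spaces of A and B:
    orthonormalise both, then take arccos of the singular values."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    sigma = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(sigma, -1.0, 1.0))

A = np.random.randn(10, 3)    # bases of two domain representations
B = np.random.randn(10, 3)
print(np.degrees(principal_angles(A, B)).round(1))
```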

Dependent Indian Buffet Process-based Sparse Nonparametric Nonnegative Matrix Factorization

no code implementations12 Jul 2015 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo

Under this same framework, two classes of correlation function are proposed: (1) using the bivariate beta distribution and (2) using a copula function.

Clustering Recommendation Systems

Infinite Author Topic Model based on Mixed Gamma-Negative Binomial Process

no code implementations30 Mar 2015 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo

One branch of these works is the so-called Author Topic Model (ATM), which incorporates the authors' interests as side information into the classical topic model.

Information Retrieval Retrieval
