Search Results for author: Ga Wu

Found 14 papers, 4 papers with code

Data-centric Prediction Explanation via Kernelized Stein Discrepancy

no code implementations22 Mar 2024 Mahtab Sarvmaili, Hassan Sajjad, Ga Wu

Existing example-based prediction explanation methods often bridge test and training data points through the model's parameters or latent representations.
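
The title refers to kernelized Stein discrepancy (KSD). As a rough illustration of that quantity itself (not the paper's explanation method), below is a minimal sketch of a KSD estimate with an RBF kernel, assuming a standard-normal reference distribution whose score is simply s(x) = -x.

```python
# Minimal sketch of a kernelized Stein discrepancy (KSD) estimate with an RBF
# kernel, assuming a standard-normal reference whose score is s(x) = -x.
# Illustrative only; not the paper's explanation estimator.
import numpy as np

def rbf_kernel_terms(x, y, h):
    """Return k(x, y) and the kernel-derivative terms used by the Stein kernel."""
    diff = x - y
    sq = np.dot(diff, diff)
    k = np.exp(-sq / (2 * h ** 2))
    grad_y_k = diff / h ** 2 * k           # d k / d y
    grad_x_k = -diff / h ** 2 * k          # d k / d x
    trace_term = (len(x) / h ** 2 - sq / h ** 4) * k
    return k, grad_x_k, grad_y_k, trace_term

def stein_kernel(x, y, score_fn, h=1.0):
    """Stein kernel u_p(x, y) built from the score of the reference density p."""
    sx, sy = score_fn(x), score_fn(y)
    k, gx, gy, tr = rbf_kernel_terms(x, y, h)
    return sx @ sy * k + sx @ gy + sy @ gx + tr

def ksd(samples, score_fn, h=1.0):
    """V-statistic estimate of KSD^2 over a set of samples."""
    n = len(samples)
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += stein_kernel(samples[i], samples[j], score_fn, h)
    return total / n ** 2

rng = np.random.default_rng(0)
score = lambda x: -x                       # score of N(0, I)
close = rng.normal(0.0, 1.0, size=(200, 2))
far = rng.normal(2.0, 1.0, size=(200, 2))
print("KSD^2 near:", ksd(close, score))    # small: samples match the reference
print("KSD^2 far :", ksd(far, score))      # larger: samples do not match
```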

Within-basket Recommendation via Neural Pattern Associator

no code implementations25 Jan 2024 Kai Luo, Tianshu Shen, Lan Yao, Ga Wu, Aaron Liblong, Istvan Fehervari, Ruijian An, Jawad Ahmed, Harshit Mishra, Charu Pujari

Within-basket recommendation (WBR) refers to the task of recommending items with the goal of completing a non-empty shopping basket during a shopping session.

Quantization

Self-supervised Representation Learning From Random Data Projectors

1 code implementation11 Oct 2023 Yi Sui, Tongzi Wu, Jesse C. Cresswell, Ga Wu, George Stein, Xiao Shi Huang, Xiaochen Zhang, Maksims Volkovs

Self-supervised representation learning (SSRL) has advanced considerably by exploiting the transformation invariance assumption under artificially designed data augmentations.

Data Augmentation, Representation Learning
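
As a hedged sketch of the general idea of learning representations without handcrafted augmentations, the toy example below trains an encoder to regress the outputs of fixed random projections of the raw input. The architecture, loss, and hyperparameters are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: learn a representation by predicting the outputs of
# fixed, randomly initialized data projectors instead of using augmentations.
import torch
import torch.nn as nn

d_in, d_rep, d_proj, n_proj = 32, 64, 16, 4

encoder = nn.Sequential(nn.Linear(d_in, 128), nn.ReLU(), nn.Linear(128, d_rep))
# One small prediction head per random projector.
heads = nn.ModuleList([nn.Linear(d_rep, d_proj) for _ in range(n_proj)])
# Fixed random projectors: frozen linear maps applied to the raw input.
projectors = [nn.Linear(d_in, d_proj) for _ in range(n_proj)]
for p in projectors:
    p.requires_grad_(False)

opt = torch.optim.Adam(list(encoder.parameters()) + list(heads.parameters()), lr=1e-3)

x = torch.randn(256, d_in)                 # stand-in for a batch of unlabeled data
for step in range(100):
    z = encoder(x)
    loss = sum(((head(z) - proj(x)) ** 2).mean()
               for head, proj in zip(heads, projectors))
    opt.zero_grad()
    loss.backward()
    opt.step()
print("final loss:", loss.item())
```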

fAux: Testing Individual Fairness via Gradient Alignment

no code implementations10 Oct 2022 Giuseppe Castiglione, Ga Wu, Christopher Srinivasa, Simon Prince

We propose a novel criterion for evaluating individual fairness and develop a practical testing method based on this criterion which we call fAux (pronounced fox).

Fairness
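
As a rough, hypothetical illustration of gradient alignment (the idea named in the title, not the exact fAux criterion, thresholds, or normalization), one can compare the input gradient of the model under test with that of an auxiliary model trained to predict a protected attribute.

```python
# Hypothetical sketch: cosine alignment between input gradients of the model
# under test and an auxiliary protected-attribute predictor. High alignment on
# a test point suggests the prediction leans on attribute-correlated directions.
import torch
import torch.nn as nn

d = 10
target_model = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
aux_model = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
# (Both models would normally be trained; here they are random stand-ins.)

def input_gradient(model, x):
    x = x.clone().requires_grad_(True)
    model(x).sum().backward()
    return x.grad.detach()

x_test = torch.randn(1, d)
g_target = input_gradient(target_model, x_test)
g_aux = input_gradient(aux_model, x_test)

alignment = torch.nn.functional.cosine_similarity(g_target, g_aux, dim=1)
print("gradient alignment:", alignment.item())
```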

Scalable Whitebox Attacks on Tree-based Models

no code implementations31 Mar 2022 Giuseppe Castiglione, Gavin Ding, Masoud Hashemi, Christopher Srinivasa, Ga Wu

Adversarial robustness is one of the essential safety criteria for guaranteeing the reliability of machine learning models.

Adversarial Robustness

PUMA: Performance Unchanged Model Augmentation for Training Data Removal

no code implementations2 Mar 2022 Ga Wu, Masoud Hashemi, Christopher Srinivasa

The proposed method then compensates for the negative impact of removing the marked data by optimally reweighting the remaining data.

Model Optimization
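
A loose, hypothetical sketch of the reweighting idea: choose weights for the remaining points so that their weighted per-sample gradients approximate the full-data gradient, which is one simple way to offset removal. This is not PUMA's influence-based estimator or optimization procedure.

```python
# Hypothetical sketch: least-squares weights on the remaining points so that
# their weighted gradient sum matches the full-data gradient sum.
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5
grads = rng.normal(size=(n, p))            # per-sample gradients at the trained model
removed = np.arange(10)                    # indices marked for removal
remaining = np.setdiff1d(np.arange(n), removed)

full_grad = grads.sum(axis=0)              # gradient signal of the full dataset
G = grads[remaining]                       # (n - r, p)

# Weights w minimizing || G^T w - full_grad ||^2.
w, *_ = np.linalg.lstsq(G.T, full_grad, rcond=None)

reweighted_grad = G.T @ w
print("residual norm:", np.linalg.norm(reweighted_grad - full_grad))
```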

Multi-axis Attentive Prediction for Sparse Event Data: An Application to Crime Prediction

1 code implementation5 Oct 2021 Yi Sui, Ga Wu, Scott Sanner

We additionally introduce a novel Frobenius norm-based contrastive learning objective to improve latent representational generalization. Empirically, we validate MAPSED on two publicly accessible urban crime datasets for spatiotemporal sparse event prediction, where MAPSED outperforms both classical and state-of-the-art deep learning models.

Contrastive Learning, Crime Prediction
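
As a hedged sketch of what a Frobenius norm-based contrastive objective can look like (shapes, temperature, and negative sampling are assumptions, not MAPSED's exact loss), similarity between latent matrix representations is measured with a negative squared Frobenius distance and pushed through a softmax over negatives.

```python
# Hedged sketch of a contrastive loss using Frobenius-norm distances between
# latent matrices; the positive pair is treated as the correct "class".
import torch
import torch.nn.functional as F

def frobenius_contrastive_loss(anchor, positive, negatives, temperature=0.5):
    """anchor, positive: (k, d) latent matrices; negatives: (m, k, d)."""
    pos_sim = -torch.linalg.matrix_norm(anchor - positive, ord="fro") ** 2
    neg_sim = -torch.linalg.matrix_norm(anchor.unsqueeze(0) - negatives,
                                        ord="fro", dim=(-2, -1)) ** 2
    logits = torch.cat([pos_sim.view(1), neg_sim]) / temperature
    return F.cross_entropy(logits.unsqueeze(0), torch.zeros(1, dtype=torch.long))

anchor = torch.randn(4, 8)
positive = anchor + 0.1 * torch.randn(4, 8)
negatives = torch.randn(16, 4, 8)
print("loss:", frobenius_contrastive_loss(anchor, positive, negatives).item())
```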

Attentive Autoencoders for Multifaceted Preference Learning in One-class Collaborative Filtering

no code implementations24 Oct 2020 Zheda Mai, Ga Wu, Kai Luo, Scott Sanner

In order to capture multifaceted user preferences, existing recommender systems either increase the encoding complexity or extend the latent representation dimension.

Collaborative Filtering, Recommendation Systems

Noise Contrastive Estimation for Autoencoding-based One-Class Collaborative Filtering

no code implementations3 Aug 2020 Jin Peng Zhou, Ga Wu, Zheda Mai, Scott Sanner

One-class collaborative filtering (OC-CF) is a common class of recommendation problem where only the positive class is explicitly observed (e.g., purchases, clicks).

Collaborative Filtering
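
A hedged sketch of a noise-contrastive style objective for an autoencoding recommender: observed (positive) items are contrasted against negatives sampled from item popularity. The sampling scheme, scoring, and architecture below are assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: NCE-style loss for a linear autoencoder recommender,
# contrasting observed items against popularity-sampled noise items.
import torch
import torch.nn as nn
import torch.nn.functional as F

n_items, d = 500, 32
encoder = nn.Linear(n_items, d)
decoder = nn.Linear(d, n_items)

# Toy implicit-feedback matrix and empirical item popularity for noise sampling.
interactions = (torch.rand(64, n_items) < 0.02).float()
popularity = interactions.sum(0) + 1.0
noise_dist = popularity / popularity.sum()

scores = decoder(encoder(interactions))               # (users, items)
pos_mask = interactions.bool()

# Sample as many popularity-weighted negatives as there are positives.
neg_idx = torch.multinomial(noise_dist, num_samples=int(pos_mask.sum()), replacement=True)

pos_scores = scores[pos_mask]
neg_scores = scores[torch.randint(0, scores.size(0), neg_idx.shape), neg_idx]

loss = -(F.logsigmoid(pos_scores).mean() + F.logsigmoid(-neg_scores).mean())
print("NCE-style loss:", loss.item())
```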

Scalable Planning with Deep Neural Network Learned Transition Models

no code implementations5 Apr 2019 Ga Wu, Buser Say, Scott Sanner

But there remains one major problem for the task of control -- how can we plan with deep network learned transition models without resorting to Monte Carlo Tree Search and other black-box transition model techniques that ignore model structure and do not easily extend to mixed discrete and continuous domains?

Aesthetic Features for Personalized Photo Recommendation

no code implementations31 Aug 2018 Yu Qing Zhou, Ga Wu, Scott Sanner, Putra Manggala

Many photography websites such as Flickr, 500px, Unsplash, and Adobe Behance are used by amateur and professional photography enthusiasts.

Collaborative Filtering, Image Retrieval +1

Conditional Inference in Pre-trained Variational Autoencoders via Cross-coding

1 code implementation ICLR 2019 Ga Wu, Justin Domke, Scott Sanner

Variational Autoencoders (VAEs) are a popular generative model, but one in which conditional inference can be challenging.
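
A heavily hedged sketch of one way conditional inference can be approached with a frozen, pre-trained decoder: train a separate network ("cross-coder" here, by analogy with the title) that maps the observed subset of variables to a latent code whose decoding reproduces the evidence. The reconstruction objective and architecture below are assumptions, not the paper's cross-coding formulation.

```python
# Hypothetical sketch: fit a network from evidence to the latent space of a
# frozen decoder so decoded samples agree with the observed dimensions.
import torch
import torch.nn as nn

d_x, d_z, d_obs = 20, 8, 5
decoder = nn.Sequential(nn.Linear(d_z, 64), nn.ReLU(), nn.Linear(64, d_x))
decoder.requires_grad_(False)              # stand-in for a pre-trained, frozen decoder

cross_coder = nn.Sequential(nn.Linear(d_obs, 64), nn.ReLU(), nn.Linear(64, d_z))
opt = torch.optim.Adam(cross_coder.parameters(), lr=1e-3)

obs_dims = list(range(d_obs))              # hypothetical: first 5 dims are observed
x_obs = torch.randn(128, d_obs)            # stand-in for observed evidence

for step in range(200):
    z = cross_coder(x_obs)
    x_hat = decoder(z)
    loss = ((x_hat[:, obs_dims] - x_obs) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Decoding z now yields full vectors whose observed dimensions fit the evidence.
print("evidence fit:", loss.item())
```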

Scalable Planning with Tensorflow for Hybrid Nonlinear Domains

no code implementations NeurIPS 2017 Ga Wu, Buser Say, Scott Sanner

Given recent deep learning results demonstrating that high-dimensional non-convex functions can be effectively optimized with gradient descent on GPUs, we ask in this paper whether symbolic gradient optimization tools such as Tensorflow can be effective for planning in hybrid (mixed discrete and continuous) nonlinear domains with high-dimensional state and action spaces.
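
The two planning papers above share the recipe of planning by gradient descent through a (learned) transition model: treat the action sequence as the optimization variable, roll the model forward, and backpropagate a cost. Below is a hedged sketch of that recipe, written in PyTorch for brevity (the paper's experiments use Tensorflow); the dynamics network, cost, and horizon are stand-in assumptions.

```python
# Hedged sketch: gradient-based planning through a differentiable transition
# model. The randomly initialized network stands in for a trained dynamics model.
import torch
import torch.nn as nn

d_state, d_action, horizon = 6, 2, 20
transition = nn.Sequential(nn.Linear(d_state + d_action, 64), nn.Tanh(),
                           nn.Linear(64, d_state))
transition.requires_grad_(False)           # stand-in for a trained dynamics model

s0 = torch.zeros(d_state)
goal = torch.ones(d_state)
actions = torch.zeros(horizon, d_action, requires_grad=True)
opt = torch.optim.Adam([actions], lr=0.05)

for step in range(300):
    s = s0
    cost = torch.tensor(0.0)
    for t in range(horizon):
        s = transition(torch.cat([s, actions[t]]))
        cost = cost + ((s - goal) ** 2).sum() + 0.01 * (actions[t] ** 2).sum()
    opt.zero_grad()
    cost.backward()
    opt.step()

print("planned cost:", cost.item())
```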
