no code implementations • 10 Sep 2023 • Deguang Kong, Abhay Jha, Lei Yun
This paper presents a novel teachable conversational interaction system that is capable of learning users' preferences from a cold start by gradually adapting to personal preferences.
no code implementations • 10 Sep 2023 • Deguang Kong, Daniel Zhou, Zhiheng Huang, Steph Sigalas
Existing neural relevance models do not give enough consideration to query and item context information, which diversifies the search results to adapt to personal preference.
no code implementations • 13 Feb 2023 • Danilo Ribeiro, Shen Wang, Xiaofei Ma, Henry Zhu, Rui Dong, Deguang Kong, Juliette Burger, Anjelica Ramos, William Wang, Zhiheng Huang, George Karypis, Bing Xiang, Dan Roth
We introduce STREET, a unified multi-task and multi-domain natural language reasoning and explanation benchmark.
no code implementations • 27 Dec 2022 • Deguang Kong, Miao Lu, Konstantin Shmakov, Jian Yang
Consensus clustering aggregates partitions to find a better fit by reconciling clustering results from different sources or executions.
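The aggregation idea can be sketched with a minimal evidence-accumulation rule (not the paper's actual algorithm): count how often two items co-cluster across the input partitions, and merge items whose co-association exceeds a threshold.

```python
from itertools import combinations

def consensus_clusters(partitions, threshold=0.5):
    """Aggregate several clusterings of the same n items into one.

    partitions: list of label lists, each of length n.
    Two items are merged when they share a cluster in more than
    `threshold` of the input partitions (a simple co-association rule).
    """
    n = len(partitions[0])
    m = len(partitions)
    parent = list(range(n))  # union-find over items

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i, j in combinations(range(n), 2):
        votes = sum(p[i] == p[j] for p in partitions)
        if votes / m > threshold:
            parent[find(i)] = find(j)  # merge the two items' clusters

    roots = [find(i) for i in range(n)]
    relabel = {r: k for k, r in enumerate(dict.fromkeys(roots))}
    return [relabel[r] for r in roots]
```

For example, three partitions of four items that mostly agree on the split {0,1} vs {2,3} yield the consensus labeling [0, 0, 1, 1], even when one partition mislabels item 1.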
no code implementations • 26 Dec 2022 • Deguang Kong, Konstantin Shmakov, Jian Yang
In computational advertising, a challenging problem is how to recommend bids for advertisers to achieve the best return on investment (ROI) given a budget constraint.
no code implementations • 26 Dec 2022 • Deguang Kong, Konstantin Shmakov, Jian Yang
In cost-per-click (CPC) or cost-per-impression (CPM) advertising campaigns, advertisers always run the risk of spending the budget without getting enough conversions.
1 code implementation • 12 Oct 2022 • Xiyang Hu, Xinchi Chen, Peng Qi, Deguang Kong, Kunlun Liu, William Yang Wang, Zhiheng Huang
Multilingual information retrieval (IR) is challenging since annotated training data is costly to obtain in many languages.
2 code implementations • 17 Feb 2020 • Wei Deng, Junwei Pan, Tian Zhou, Deguang Kong, Aaron Flores, Guang Lin
To address the issue of significantly increased serving delay and high memory usage for ad serving in production, this paper presents \emph{DeepLight}: a framework to accelerate CTR predictions in three aspects: 1) accelerate model inference via explicitly searching informative feature interactions in the shallow component; 2) prune redundant layers and parameters at the intra-layer and inter-layer levels in the DNN component; 3) promote the sparsity of the embedding layer to preserve the most discriminative signals.
Ranked #7 on Click-Through Rate Prediction on Avazu
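The sparsity-promotion idea in aspects 2 and 3 can be illustrated with a generic magnitude-pruning sketch (a simplification, not DeepLight's actual pruning schedule or criterion):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of entries in a weight matrix.

    weights:  list of rows (list of floats), e.g. an embedding table.
    sparsity: target fraction of entries to zero out, in [0, 1].
    Keeping only large-magnitude entries preserves the strongest signals
    while shrinking the memory footprint of the layer.
    """
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    cutoff = flat[k - 1] if k > 0 else float("-inf")
    return [[0.0 if abs(w) <= cutoff else w for w in row] for row in weights]
```

In practice, such pruning is applied gradually during training so the remaining weights can adapt; here the threshold is chosen in one shot purely for illustration.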
no code implementations • ICLR 2018 • Anderson Y. Zhang, Miao Lu, Deguang Kong, Jimmy Yang
However, their performance is easily undermined by change points and anomaly points, two structures commonly observed in real data but rarely considered in the aforementioned methods.
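A toy illustration (not from the paper) of how a change point distorts a naive estimator: a global-mean forecast averages across a level shift and lands between the two regimes, tracking neither.

```python
def naive_mean_forecast(series):
    """Forecast the next value as the global mean of the history.
    This estimator ignores any structural change in the series."""
    return sum(series) / len(series)

# A series with a change point at t=4: the level jumps from 10 to 20.
series_stable = [10.0] * 8
series_shifted = [10.0] * 4 + [20.0] * 4
```

The stable series is forecast correctly at 10.0, but the shifted series is forecast at 15.0 even though the current level is 20.0, showing why change points must be modeled explicitly.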
no code implementations • 16 Aug 2017 • Dawei Li, Xiaolong Wang, Deguang Kong
As observed in the experiment, DeepRebirth achieves more than 3x speed-up and 2.5x run-time memory saving on GoogLeNet with only 0.4% drop of top-5 accuracy on ImageNet.
no code implementations • NeurIPS 2014 • Deguang Kong, Ryohei Fujimaki, Ji Liu, Feiping Nie, Chris Ding
Group lasso is widely used to enforce structural sparsity, achieving sparsity at the inter-group level.
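The group lasso regularizer is the sum of the L2 norms of each group's coefficients, which drives entire groups of coefficients to zero simultaneously (a minimal sketch of the standard penalty, with hypothetical variable names):

```python
import math

def group_lasso_penalty(w, groups, lam=1.0):
    """Group lasso regularizer: lam * sum over groups g of ||w_g||_2.

    w:      coefficient vector (list of floats).
    groups: list of index lists partitioning the coefficients.
    Because the L2 norm of a group is non-differentiable only at the
    group-wise zero, optimization tends to zero out whole groups at once.
    """
    return lam * sum(
        math.sqrt(sum(w[i] ** 2 for i in g)) for g in groups
    )
```

For example, with w = [3, 4, 0, 0] and groups {0,1} and {2,3}, the penalty is ||(3,4)|| + ||(0,0)|| = 5.0: the second group contributes nothing, which is exactly the inter-group sparsity the penalty encourages.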