no code implementations • 26 Jul 2021 • Qingyun She, Zhiqiang Wang, Junlin Zhang
For example, continuous features are usually transformed into power forms by adding new features, which allows the model to easily fit non-linear functions of the original feature.
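The power-form transformation mentioned above can be sketched as a minimal numpy example; the helper name and the chosen powers are illustrative, not taken from the paper:

```python
import numpy as np

def add_power_features(x, powers=(2, 3)):
    """Augment a continuous feature with power transforms so that a
    linear model can fit non-linear functions of x (illustrative sketch).

    x : (n_samples,) raw continuous feature
    returns : (n_samples, 1 + len(powers)) matrix [x, x^2, x^3, ...]
    """
    cols = [x] + [x ** p for p in powers]
    return np.stack(cols, axis=1)
```

A linear layer over the augmented columns can then represent polynomials of the original feature.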
3 code implementations • 26 Jul 2021 • Zhiqiang Wang, Qingyun She, PengTao Zhang, Junlin Zhang
In this paper, we propose a novel CTR framework named ContextNet that implicitly models high-order feature interactions by dynamically refining each feature's embedding according to the input context.
Ranked #14 on Click-Through Rate Prediction on Criteo
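The context-aware embedding refinement described in the ContextNet entry above can be sketched roughly as follows. This is a minimal numpy illustration of the idea only, assuming a two-layer aggregate-then-project network whose sigmoid output gates each embedding element; all names and shapes are assumptions, not the paper's implementation:

```python
import numpy as np

def refine_embeddings(E, W_agg, W_proj):
    """Context-aware refinement sketch: every feature embedding is
    rescaled element-wise by a gate computed from the full input context.

    E      : (num_fields, dim) feature embeddings
    W_agg  : (num_fields * dim, hidden) context aggregation weights
    W_proj : (hidden, num_fields * dim) projection back to embedding size
    """
    context = E.reshape(-1)                       # concatenate all embeddings
    h = np.maximum(context @ W_agg, 0.0)          # ReLU aggregation layer
    gate = 1.0 / (1.0 + np.exp(-(h @ W_proj)))    # sigmoid gate per element
    return E * gate.reshape(E.shape)              # refine each embedding
```

Because the gate is computed from the concatenation of all field embeddings, each refined embedding depends on the whole input context rather than on its own field alone.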
14 code implementations • 9 Feb 2021 • Zhiqiang Wang, Qingyun She, Junlin Zhang
We also turn the feed-forward layer of the DNN model into a mixture of additive and multiplicative feature interactions by proposing MaskBlock in this paper.
Ranked #8 on Click-Through Rate Prediction on Criteo
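The multiplicative-plus-additive mixture behind a MaskBlock can be sketched as below. This is a rough numpy illustration assuming an instance-guided mask produced by a small two-layer network; the function and parameter names are hypothetical, not the paper's exact design:

```python
import numpy as np

def mask_block(v_emb, v_hidden, W_agg, W_proj, W_ff):
    """MaskBlock-style sketch: an instance-guided mask applies a
    multiplicative interaction to a hidden state, and a feed-forward
    layer then adds the usual additive interaction.

    v_emb    : (d_emb,) input embedding that guides the mask
    v_hidden : (d_hid,) hidden state to be masked
    W_agg    : (d_emb, d_mid) mask aggregation weights
    W_proj   : (d_mid, d_hid) mask projection weights
    W_ff     : (d_hid, d_out) feed-forward weights
    """
    mask = np.maximum(v_emb @ W_agg, 0.0) @ W_proj   # instance-guided mask
    masked = mask * v_hidden                         # multiplicative part
    return np.maximum(masked @ W_ff, 0.0)            # additive feed-forward
```

The mask depends on the input instance itself, so different inputs re-weight the same hidden state differently before the additive layer runs.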
no code implementations • 13 Sep 2020 • Tongwen Huang, Qingyun She, Junlin Zhang
Our proposed model uses the pre-trained Transformer as the base classifier, selecting harder training subsets for fine-tuning, and thus gains the benefits of both pre-trained language knowledge and boosting ensembles in NLP tasks.
3 code implementations • 6 Jul 2020 • Tongwen Huang, Qingyun She, Zhiqiang Wang, Junlin Zhang
Inspired by these observations, we propose a novel model named GateNet which introduces either the feature embedding gate or the hidden gate to the embedding layer or hidden layers of DNN CTR models, respectively.
Ranked #20 on Click-Through Rate Prediction on Criteo
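The feature embedding gate from the GateNet entry above can be sketched as follows. This minimal numpy version uses a single gate matrix shared across fields for brevity (a simplification; only the overall gating idea is taken from the abstract), and all names are illustrative:

```python
import numpy as np

def feature_embedding_gate(E, W_gate):
    """GateNet-style embedding gate sketch: each field's embedding is
    rescaled element-wise by a sigmoid gate computed from that same
    embedding, letting the model select salient feature information.

    E      : (num_fields, dim) feature embeddings
    W_gate : (dim, dim) gate weights, shared across fields here
    """
    gates = 1.0 / (1.0 + np.exp(-(E @ W_gate)))   # element-wise sigmoid gate
    return E * gates                              # gated embeddings
```

A hidden gate for the DNN's hidden layers follows the same pattern, with the gate computed from the hidden activation instead of the field embedding.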
1 code implementation • 23 Jun 2020 • Zhiqiang Wang, Qingyun She, PengTao Zhang, Junlin Zhang
Normalization has become one of the most fundamental components of many deep neural networks for machine learning tasks, while deep neural networks have also been widely used in the CTR estimation field.