Search Results for author: Lin Zehui

Found 2 papers, 0 papers with code

DropAttention: A Regularization Method for Fully-Connected Self-Attention Networks

no code implementations • 25 Jul 2019 • Lin Zehui, Pengfei Liu, Luyao Huang, Junkun Chen, Xipeng Qiu, Xuanjing Huang

Variant dropout methods have been designed for the fully-connected, convolutional, and recurrent layers of neural networks, and have been shown to be effective at avoiding overfitting.
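For context, the paper extends this idea to self-attention by dropping entries of the attention-weight matrix. Below is a minimal PyTorch sketch of that idea, assuming element-wise dropping followed by row renormalization; the function name, shapes, and rescaling details are illustrative, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def drop_attention(query, key, value, p=0.1, training=True):
    """Scaled dot-product attention with dropout applied to the
    attention weights (a sketch of the DropAttention idea; the
    rescaling scheme here is an assumption for illustration)."""
    d_k = query.size(-1)
    # Standard scaled dot-product attention scores.
    scores = query @ key.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)
    if training and p > 0:
        # Randomly zero out attention weights, then renormalize each
        # row so the surviving weights still sum to one.
        mask = torch.bernoulli(torch.full_like(weights, 1 - p))
        weights = weights * mask
        weights = weights / weights.sum(dim=-1, keepdim=True).clamp_min(1e-9)
    return weights @ value
```

Unlike standard dropout on activations, masking is applied to the normalized attention distribution itself, so renormalizing the rows keeps each query attending with a proper probability distribution over positions.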
