
Attention W-Net: Improved Skip Connections for better Representations

Segmentation of macro- and micro-vascular structures in fundoscopic retinal images plays a crucial role in the detection of multiple retinal and systemic diseases, yet it remains a difficult problem. Most neural-network approaches suffer from issues such as an insufficient number of parameters, overfitting, and/or incompatibility between internal feature spaces. We propose Attention W-Net, a new U-Net-based architecture for retinal vessel segmentation that addresses these problems. The architecture makes two main contributions: an Attention Block and a set of regularisation measures. The Attention Block applies attention between encoder and decoder features, making them more compatible when added. The regularisation measures comprise data augmentation and modifications to the ResNet block used, which substantially reduce overfitting. We obtain an F1 score and AUC of 0.8407 and 0.9833 on the DRIVE dataset and 0.8174 and 0.9865 on the CHASE-DB1 dataset, respectively - a sizeable improvement over the backbone as well as competitive performance among contemporary state-of-the-art methods.
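To illustrate the idea of attending between encoder and decoder features on a skip connection before merging them by addition, here is a minimal PyTorch sketch in the style of an attention gate. It assumes the encoder and decoder feature maps have matching spatial size and channel count at the skip connection; the class and parameter names (AttentionBlock, w_enc, w_dec, psi, inter_channels) are hypothetical and the exact block in the paper may differ.

```python
import torch
import torch.nn as nn


class AttentionBlock(nn.Module):
    """Attention gate over a skip connection (illustrative sketch).

    Encoder skip features are re-weighted using a gating signal derived
    from both the encoder and decoder feature maps, then merged with the
    decoder features by addition rather than concatenation.
    """

    def __init__(self, enc_channels, dec_channels, inter_channels):
        super().__init__()
        # 1x1 projections of both feature maps into a shared intermediate space.
        self.w_enc = nn.Conv2d(enc_channels, inter_channels, kernel_size=1)
        self.w_dec = nn.Conv2d(dec_channels, inter_channels, kernel_size=1)
        # Per-pixel attention coefficients in [0, 1].
        self.psi = nn.Sequential(
            nn.Conv2d(inter_channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, enc_feat, dec_feat):
        # Combine projected encoder and decoder features.
        attn = self.relu(self.w_enc(enc_feat) + self.w_dec(dec_feat))
        alpha = self.psi(attn)
        # Re-weight the encoder features, then fuse with the decoder by addition
        # (assumes enc_channels == dec_channels).
        return enc_feat * alpha + dec_feat


# Example usage with matching channel counts at the skip connection.
enc = torch.randn(1, 64, 128, 128)   # encoder skip features
dec = torch.randn(1, 64, 128, 128)   # upsampled decoder features
out = AttentionBlock(64, 64, 32)(enc, dec)  # -> shape (1, 64, 128, 128)
```

Merging by addition keeps the channel count fixed across the decoder, which is one way the gated encoder features can be made dimensionally compatible with the decoder path.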
