Search Results for author: Hari Sowrirajan

Found 3 papers, 2 papers with code

MoCo-Pretraining Improves Representations and Transferability of Chest X-ray Models

no code implementations • 1 Jan 2021 • Hari Sowrirajan, Jing Bo Yang, Andrew Y. Ng, Pranav Rajpurkar

Using 0.1% of labeled training data, we find that a linear model trained on MoCo-pretrained representations outperforms one trained on representations without MoCo-pretraining by an AUC of 0.096 (95% CI 0.061, 0.130), indicating that MoCo-pretrained representations are of higher quality.
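The linear evaluation protocol behind this claim can be sketched as follows. This is a hedged, self-contained illustration with synthetic stand-in features (the real features would come from a frozen CNN backbone run over chest X-rays); the function names and data are illustrative, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for frozen encoder outputs: "pretrained" features carry
# more label signal than "baseline" features, mimicking the paper's comparison.
n, d = 200, 32
labels = rng.integers(0, 2, size=n)
signal = labels[:, None] * 0.5
feats_pretrained = rng.normal(size=(n, d)) + signal        # stronger label signal
feats_baseline = rng.normal(size=(n, d)) + 0.1 * signal    # weaker label signal

def linear_probe_auc(features, y):
    """Fit a linear model (least squares) on frozen features; report AUC."""
    X = np.hstack([features, np.ones((len(y), 1))])  # add bias column
    w, *_ = np.linalg.lstsq(X, y.astype(float), rcond=None)
    scores = X @ w
    # AUC = fraction of (positive, negative) pairs ranked correctly
    pos, neg = scores[y == 1], scores[y == 0]
    return float((pos[:, None] > neg[None, :]).mean())

auc_pre = linear_probe_auc(feats_pretrained, labels)
auc_base = linear_probe_auc(feats_baseline, labels)
```

The comparison in the paper is the gap `auc_pre - auc_base`, computed on held-out data rather than the training set used here.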

Image Classification · Transfer Learning

MoCo-CXR: MoCo Pretraining Improves Representation and Transferability of Chest X-ray Models

2 code implementations • 11 Oct 2020 • Hari Sowrirajan, Jingbo Yang, Andrew Y. Ng, Pranav Rajpurkar

In this work, we propose MoCo-CXR, which is an adaptation of the contrastive learning method Momentum Contrast (MoCo), to produce models with better representations and initializations for the detection of pathologies in chest X-rays.
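The two core mechanisms of Momentum Contrast that MoCo-CXR adapts can be sketched in a few lines: the momentum (EMA) update of the key encoder, and the InfoNCE loss contrasting a query against one positive key and a queue of negatives. This is a minimal numpy illustration of the technique, not the authors' implementation; shapes and names are assumed.

```python
import numpy as np

def momentum_update(theta_q, theta_k, m=0.999):
    """Key-encoder parameters trail the query encoder: theta_k <- m*theta_k + (1-m)*theta_q."""
    return m * theta_k + (1 - m) * theta_q

def info_nce_loss(q, k_pos, queue, tau=0.07):
    """Contrastive loss: query q vs. its positive key and a queue of negative keys."""
    l_pos = q @ k_pos                            # similarity to the positive key
    l_neg = queue @ q                            # similarities to the negatives
    logits = np.concatenate([[l_pos], l_neg]) / tau
    logits -= logits.max()                       # numerical stability
    return float(-logits[0] + np.log(np.exp(logits).sum()))

# Toy example: the query matches its positive key exactly, so the loss is near zero.
q = np.array([1.0, 0.0])
k_pos = np.array([1.0, 0.0])
queue = np.array([[0.0, 1.0], [-1.0, 0.0]])      # two negative keys
loss = info_nce_loss(q, k_pos, queue)
theta_k_new = momentum_update(np.array([1.0]), np.array([0.0]))
```

In the full method, `q` and the keys are L2-normalized encoder outputs of two augmented views of the same X-ray, and the queue is refreshed with keys from recent batches.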

Contrastive Learning · Image Classification +1
