no code implementations • 1 Apr 2023 • Hieu H. Pham, Ha Q. Nguyen, Hieu T. Nguyen, Linh T. Le, Khanh Lam
We conducted a prospective study to measure the clinical impact of an explainable machine learning system on interobserver agreement in chest radiograph interpretation.
no code implementations • 29 Mar 2023 • Hieu H. Pham, Khiem H. Le, Tuan V. Tran, Ha Q. Nguyen
The work discusses the use of machine learning algorithms for anomaly detection in medical image analysis and how the performance of these algorithms depends on the number of annotators and the quality of labels.
no code implementations • 11 Sep 2022 • Thao T. B. Nguyen, Tam M. Vo, Thang V. Nguyen, Hieu H. Pham, Ha Q. Nguyen
Our best model (CheXpert-pretrained EfficientNet-B2) yields an F1-score of 0.6989 (95% CI 0.6740, 0.7240), AUC of 0.7912, sensitivity of 0.7064 and specificity of 0.8760 for the abnormal diagnosis in general.
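Confidence intervals like the one quoted above are commonly obtained by bootstrapping over cases; a minimal sketch of a percentile-bootstrap 95% CI for a binary F1-score (the paper's exact resampling protocol is an assumption, and the labels here are synthetic):

```python
# Percentile-bootstrap 95% CI for a binary F1-score (illustrative data).
import numpy as np
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Toy labels standing in for the abnormal-vs-normal task: predictions
# agree with ground truth on roughly 80% of cases.
y_true = rng.integers(0, 2, size=500)
y_pred = (y_true ^ (rng.random(500) < 0.2)).astype(int)

point = f1_score(y_true, y_pred)

# Resample cases with replacement and recompute F1 each time.
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(y_true), size=len(y_true))
    boot.append(f1_score(y_true[idx], y_pred[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"F1 = {point:.4f} (95% CI {lo:.4f}, {hi:.4f})")
```

The 2.5th and 97.5th percentiles of the resampled scores give the interval endpoints; other variants (e.g. BCa) adjust these percentiles for bias.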
no code implementations • 6 Aug 2022 • Hieu H. Pham, Ha Q. Nguyen, Hieu T. Nguyen, Linh T. Le, Lam Khanh
For the localization task with 14 types of lesions, our free-response receiver operating characteristic (FROC) analysis showed that the VinDr-CXR achieved a sensitivity of 80.2% at the rate of 1.0 false-positive lesion identified per scan.
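An FROC operating point like "sensitivity at 1.0 false positive per scan" is read off by thresholding detection confidences so that the false-positive rate hits the target, then counting the lesions still detected. A simplified sketch (synthetic scores; the matching of detections to lesions is assumed to be one-to-one):

```python
# Sketch: lesion sensitivity at a fixed false-positive rate per scan.
import numpy as np

rng = np.random.default_rng(3)
n_scans, n_lesions = 100, 150

# Toy detection confidences: those matched to a true lesion vs spurious.
scores_tp = rng.uniform(0.4, 1.0, size=120)   # lesion hits
scores_fp = rng.uniform(0.0, 0.8, size=300)   # spurious detections

def sensitivity_at(fp_per_scan):
    # Pick the confidence threshold that admits exactly the allowed
    # number of false positives across all scans.
    k = int(round(fp_per_scan * n_scans))
    thr = np.sort(scores_fp)[::-1][k - 1] if k > 0 else np.inf
    # Sensitivity = fraction of all lesions detected above the threshold.
    return np.sum(scores_tp >= thr) / n_lesions

print("sensitivity @ 1.0 FP/scan:", sensitivity_at(1.0))
```

Sweeping `fp_per_scan` and plotting sensitivity against it traces out the full FROC curve.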
no code implementations • MIDL 2019 • Dat T. Ngo, Thao T. B. Nguyen, Hieu T. Nguyen, Dung B. Nguyen, Ha Q. Nguyen, Hieu H. Pham
In particular, deep convolutional neural networks (D-CNNs) have been key players, adopted by the medical imaging community to assist clinicians and medical experts in disease diagnosis and treatment.
no code implementations • 20 Mar 2022 • Binh T. Dao, Thang V. Nguyen, Hieu H. Pham, Ha Q. Nguyen
This work aims at developing and validating a precise, fast multi-phase classifier to recognize three main types of contrast phases in abdominal CT scans.
1 code implementation • 20 Mar 2022 • Hieu T. Nguyen, Ha Q. Nguyen, Hieu H. Pham, Khanh Lam, Linh T. Le, Minh Dao, Van Vu
Mammography, or breast X-ray, is the most widely used imaging modality to detect cancer and other breast diseases.
no code implementations • 20 Mar 2022 • Sam B. Tran, Huyen T. X. Nguyen, Chi Phan, Hieu H. Pham, Ha Q. Nguyen
Image augmentation techniques have been widely investigated to improve the performance of deep learning (DL) algorithms on mammography classification tasks.
1 code implementation • 20 Mar 2022 • Hieu H. Pham, Ngoc H. Nguyen, Thanh T. Tran, Tuan N. M. Nguyen, Ha Q. Nguyen
To the best of our knowledge, this is the first and largest pediatric CXR dataset containing lesion-level annotations and image-level labels for the detection of multiple findings and diseases.
1 code implementation • 20 Mar 2022 • Khiem H. Le, Tuan V. Tran, Hieu H. Pham, Hieu T. Nguyen, Tung T. Le, Ha Q. Nguyen
As a result, the labeled data may contain a variety of human biases with a high rate of disagreement among annotators, which significantly affect the performance of supervised machine learning algorithms.
no code implementations • 8 Dec 2021 • Huyen T. X. Nguyen, Sam B. Tran, Dung B. Nguyen, Hieu H. Pham, Ha Q. Nguyen
The experimental results demonstrate that the proposed approach outperforms the single-view classification approach on two benchmark datasets by wide F1-score margins (+5% on the internal dataset and +10% on the DDSM dataset).
no code implementations • 14 Aug 2021 • Hieu H. Pham, Dung V. Do, Ha Q. Nguyen
This challenge raises the need for an automated and efficient approach to classifying body parts from X-ray scans.
no code implementations • 14 Aug 2021 • Thanh T. Tran, Hieu H. Pham, Thang V. Nguyen, Tung T. Le, Hieu T. Nguyen, Ha Q. Nguyen
Chest radiograph (CXR) interpretation in pediatric patients is error-prone and demands a high level of radiologic expertise.
1 code implementation • 3 Jul 2021 • Hoang C. Nguyen, Tung T. Le, Hieu H. Pham, Ha Q. Nguyen
We introduce a new benchmark dataset, namely VinDr-RibCXR, for automatic segmentation and labeling of individual ribs from chest X-ray (CXR) scans.
1 code implementation • 24 Jun 2021 • Hieu T. Nguyen, Hieu H. Pham, Nghia T. Nguyen, Ha Q. Nguyen, Thang Q. Huynh, Minh Dao, Van Vu
It demonstrates an area under the receiver operating characteristic curve (AUROC) of 88.61% (95% CI 87.19%, 90.02%) for the image-level classification task and a mean average precision (mAP@0.5) of 33.56% for the lesion-level localization task.
3 code implementations • 30 Dec 2020 • Ha Q. Nguyen, Khanh Lam, Linh T. Le, Hieu H. Pham, Dat Q. Tran, Dung B. Nguyen, Dung D. Le, Chi M. Pham, Hang T. T. Tong, Diep H. Dinh, Cuong D. Do, Luu T. Doan, Cuong N. Nguyen, Binh T. Nguyen, Que V. Nguyen, Au D. Hoang, Hien N. Phan, Anh T. Nguyen, Phuong H. Ho, Dat T. Ngo, Nghia T. Nguyen, Nhan T. Nguyen, Minh Dao, Van Vu
Most of the existing chest X-ray datasets include labels from a list of findings without specifying their locations on the radiographs.
no code implementations • MIDL 2019 • Hieu H. Pham, Tung T. Le, Dat T. Ngo, Dat Q. Tran, Ha Q. Nguyen
The chest X-ray (CXR) is one of the views most commonly ordered by radiologists (NHS) and is critical for the diagnosis of many different thoracic diseases.
1 code implementation • MIDL 2019 • Nhan T. Nguyen, Dat Q. Tran, Nghia T. Nguyen, Ha Q. Nguyen
We validate the method on the recent RSNA Intracranial Hemorrhage Detection challenge and on the CQ500 dataset.
2 code implementations • 15 Nov 2019 • Hieu H. Pham, Tung T. Le, Dat Q. Tran, Dat T. Ngo, Ha Q. Nguyen
The performance is on average better than 2.6 out of 3 other individual radiologists with a mean AUC of 0.930, which ranks first on the CheXpert leaderboard at the time of writing this paper.
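A mean AUC of this kind is the average of per-finding AUCs over the CheXpert competition tasks. A rough sketch with synthetic scores (the label set is the real CheXpert competition set, but the numbers produced here are illustrative only):

```python
# Sketch: mean AUC over multiple findings, CheXpert-leaderboard style.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
labels = ["Atelectasis", "Cardiomegaly", "Consolidation",
          "Edema", "Pleural Effusion"]

n = 400
y_true = rng.integers(0, 2, size=(n, len(labels)))
# Informative synthetic scores: shift the predicted probability up
# for positive cases so each per-label AUC is well above chance.
y_score = np.clip(0.5 * y_true + 0.7 * rng.random((n, len(labels))), 0, 1)

per_label = [roc_auc_score(y_true[:, j], y_score[:, j])
             for j in range(len(labels))]
mean_auc = float(np.mean(per_label))

for name, auc in zip(labels, per_label):
    print(f"{name:>16s}: {auc:.3f}")
print(f"        mean AUC: {mean_auc:.3f}")
```

Reporting the unweighted mean hides per-finding variation, which is why per-label AUCs are usually tabulated alongside it.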
Ranked #2 on Multi-Label Classification on CheXpert
no code implementations • 6 Sep 2017 • Harshit Gupta, Kyong Hwan Jin, Ha Q. Nguyen, Michael T. McCann, Michael Unser
When the projector is replaced with a CNN, we propose a relaxed PGD, which always converges.
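The relaxed PGD described above alternates a gradient step on the data-fidelity term with the learned projector, and shrinks a relaxation weight when the iterates stop contracting, which is what restores convergence when the CNN is only an approximate projector. A loose sketch, with a soft-threshold standing in for the CNN and a simplified relaxation rule (both assumptions for illustration; the paper's exact rule is not reproduced):

```python
# Sketch of relaxed projected gradient descent (relaxed PGD).
import numpy as np

rng = np.random.default_rng(2)
n, m = 64, 32
H = rng.standard_normal((m, n)) / np.sqrt(m)   # forward (measurement) operator
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = 1.0  # sparse ground truth
y = H @ x_true + 0.01 * rng.standard_normal(m)

def F(x, tau=0.05):
    # Toy "projector": soft-thresholding toward sparse signals,
    # standing in for the trained CNN.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

gamma = 1.0 / np.linalg.norm(H, 2) ** 2        # step size from spectral norm
x = np.zeros(n)
alpha, c = 1.0, 0.99
prev_step = np.inf
for _ in range(200):
    # Gradient step on ||Hx - y||^2, then the (approximate) projector.
    z = F(x - gamma * H.T @ (H @ x - y))
    step = np.linalg.norm(z - x)
    if step > c * prev_step:                   # relax when updates grow
        alpha *= 0.8
    prev_step = step
    x = alpha * z + (1 - alpha) * x            # relaxed update

print("reconstruction error:", np.linalg.norm(x - x_true))
```

With `alpha` fixed at 1 this is plain PGD, which can oscillate when `F` is not a true projector; damping `alpha` trades per-step progress for guaranteed convergence.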
no code implementations • 16 May 2017 • Ha Q. Nguyen, Emrah Bostan, Michael Unser
We propose a data-driven algorithm for the maximum a posteriori (MAP) estimation of stochastic processes from noisy observations.