no code implementations • 14 Jun 2020 • Andong Tan, Duc Tam Nguyen, Maximilian Dax, Matthias Nießner, Thomas Brox
Self-attention networks have shown remarkable progress in computer vision tasks such as image classification.
no code implementations • ICLR 2020 • Duc Tam Nguyen, Chaithanya Kumar Mummadi, Thi Phuong Nhung Ngo, Thi Hoai Phuong Nguyen, Laura Beggel, Thomas Brox
Deep neural networks (DNNs) have been shown to overfit a dataset when trained on noisy labels for long enough.
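The memorization effect summarized above can be reproduced with a minimal sketch (all hyperparameters here are illustrative, not taken from the paper): a small over-parameterized network trained by plain gradient descent on purely random labels will eventually fit them.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, h = 32, 10, 128                 # few samples, over-parameterized net
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, size=n).astype(float)   # purely random ("noisy") labels

# One hidden layer with tanh, sigmoid output, binary cross-entropy loss.
W1 = rng.normal(scale=0.5, size=(d, h)); b1 = np.zeros(h)
w2 = rng.normal(scale=0.5, size=h);      b2 = 0.0

def forward(X):
    H = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(H @ w2 + b2)))
    return H, p

lr = 0.5
for _ in range(5000):
    H, p = forward(X)
    g = (p - y) / n                   # dBCE/dlogit, averaged over samples
    gH = np.outer(g, w2) * (1.0 - H ** 2)   # backprop through tanh
    W1 -= lr * (X.T @ gH)
    b1 -= lr * gH.sum(axis=0)
    w2 -= lr * (H.T @ g)
    b2 -= lr * g.sum()

_, p = forward(X)
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"train accuracy on random labels: {acc:.2f}")
```

Despite the labels carrying no signal, the network drives training accuracy toward 100%, which is the memorization behavior the abstract refers to.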
no code implementations • 28 Sep 2019 • Duc Tam Nguyen, Maximilian Dax, Chaithanya Kumar Mummadi, Thi Phuong Nhung Ngo, Thi Hoai Phuong Nguyen, Zhongyu Lou, Thomas Brox
Alternative unsupervised approaches rely on careful selection of multiple handcrafted saliency methods to generate noisy pseudo-ground-truth labels.
no code implementations • 1 Jun 2019 • Duc Tam Nguyen, Thi-Phuong-Nhung Ngo, Zhongyu Lou, Michael Klar, Laura Beggel, Thomas Brox
We consider the problem of training a model in the presence of label noise.
2 code implementations • ICLR 2019 • Duc Tam Nguyen, Zhongyu Lou, Michael Klar, Thomas Brox
In one-class-learning tasks, only the normal case (foreground) can be modeled with data, whereas the variation of all possible anomalies is too erratic to be described by samples. Thus, due to the lack of representative data, widespread discriminative approaches cannot cover such learning tasks, and instead generative models, which attempt to learn the input density of the normal cases, are used.
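As a generic illustration of the generative approach mentioned above (not the method of this paper), one can fit a density model to the normal data alone and flag low-density inputs as anomalies; here a single Gaussian stands in for the learned input density:

```python
import numpy as np

rng = np.random.default_rng(0)
# Only "normal" (foreground) data is available at training time.
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Fit a simple generative model of the input density: a single Gaussian.
mu = normal.mean(axis=0)
cov = np.cov(normal, rowvar=False)
cov_inv = np.linalg.inv(cov)
logdet = np.linalg.slogdet(cov)[1]

def log_density(x):
    """Log-density of a 2-D point x under the fitted Gaussian."""
    diff = x - mu
    return -0.5 * (diff @ cov_inv @ diff + logdet + 2 * np.log(2 * np.pi))

# Flag anything below, e.g., the 1st percentile of training log-densities.
train_scores = np.array([log_density(x) for x in normal])
threshold = np.percentile(train_scores, 1)

print(log_density(np.array([0.1, -0.2])) > threshold)  # near the normal cluster
print(log_density(np.array([6.0, 6.0])) > threshold)   # far away: anomalous
```

A single Gaussian is of course far too crude for images; the point is only that the anomaly score comes from the learned density of the normal class, with no anomaly samples needed.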