Search Results for author: Haoran Zhu

Found 6 papers, 1 paper with code

Robust Tiny Object Detection in Aerial Images amidst Label Noise

no code implementations · 16 Jan 2024 · Haoran Zhu, Chang Xu, Wen Yang, Ruixiang Zhang, Yan Zhang, Gui-Song Xia

In this study, we address the intricate issue of tiny object detection under noisy label supervision.

Denoising Object +2

Understanding Why ViT Trains Badly on Small Datasets: An Intuitive Perspective

2 code implementations · 7 Feb 2023 · Haoran Zhu, Boyuan Chen, Carter Yang

Vision transformer (ViT) is an attention-based neural network architecture that has been shown to be effective for computer vision tasks.

Image Classification
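
As a rough illustration of the architecture this paper studies, the sketch below builds a minimal ViT-style classifier in PyTorch (patch embedding, class token, transformer encoder, classification head). The patch size, depth, and other hyperparameters are illustrative assumptions, not the paper's configuration.

```python
# Minimal ViT-style classifier sketch (PyTorch). All hyperparameters are
# illustrative assumptions, not the configuration used in the paper.
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    def __init__(self, image_size=32, patch_size=4, dim=128, depth=4,
                 heads=4, num_classes=10):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        # Patch embedding: split the image into patches and project each to `dim`.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch_size, stride=patch_size)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, dim))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        # x: (B, 3, H, W) -> patch tokens of shape (B, N, dim)
        tokens = self.patch_embed(x).flatten(2).transpose(1, 2)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        tokens = self.encoder(tokens)
        # Classify from the class token.
        return self.head(tokens[:, 0])

model = TinyViT()
logits = model(torch.randn(2, 3, 32, 32))  # -> (2, 10)
```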

TAME: Task Agnostic Continual Learning using Multiple Experts

no code implementations · 8 Oct 2022 · Haoran Zhu, Maryam Majzoubi, Arihant Jain, Anna Choromanska

Our algorithm, which we call TAME (Task-Agnostic continual learning using Multiple Experts), automatically detects the shift in data distributions and switches between task expert networks in an online manner.

Continual Learning
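
As a hedged sketch of the general idea described above, detecting a distribution shift online and switching to a per-task expert, the toy Python below uses a windowed loss statistic as the shift signal. The statistic, threshold, and expert interface are hypothetical placeholders, not TAME's actual detection rule or training procedure.

```python
# Toy sketch of online task-shift detection with per-task expert networks.
# The shift statistic (a jump in windowed loss) and the `train_step` expert API
# are hypothetical; TAME's detection and expert selection mechanisms differ.
from collections import deque

class ExpertSwitcher:
    def __init__(self, make_expert, window=50, threshold=2.0):
        self.make_expert = make_expert      # factory for a new expert network
        self.experts = [make_expert()]      # start with a single expert
        self.active = 0
        self.losses = deque(maxlen=window)  # recent losses of the active expert
        self.threshold = threshold

    def step(self, x, y):
        expert = self.experts[self.active]
        loss = expert.train_step(x, y)      # hypothetical per-batch training call
        self.losses.append(loss)
        if len(self.losses) == self.losses.maxlen:
            recent = sum(list(self.losses)[-10:]) / 10
            baseline = sum(self.losses) / len(self.losses)
            # A sudden jump in loss is read as a distribution shift:
            # spawn a fresh expert and make it the active one.
            if recent > self.threshold * baseline:
                self.experts.append(self.make_expert())
                self.active = len(self.experts) - 1
                self.losses.clear()
        return loss
```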

A robust single-pixel particle image velocimetry based on fully convolutional networks with cross-correlation embedded

no code implementations · 31 Oct 2021 · Qi Gao, Hongtao Lin, Han Tu, Haoran Zhu, Runjie Wei, Guoping Zhang, Xueming Shao

CC-FCN has two types of input layers: one for the particle images, and the other for the initial velocity field computed at coarse resolution by cross-correlation.

Super-Resolution
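
To make the two-input design concrete, the sketch below wires up a small dual-input fully convolutional network in PyTorch: one branch takes the particle image pair, the other the coarse velocity field from cross-correlation, and their features are fused by concatenation. The channel counts, upsampling factor, and fusion scheme are assumptions for illustration, not the CC-FCN architecture.

```python
# Dual-input fully convolutional network sketch: particle images in one branch,
# a coarse cross-correlation velocity field in the other, fused by concatenation.
# Layer sizes and the 8x upsampling factor are illustrative assumptions.
import torch
import torch.nn as nn

class DualInputFCN(nn.Module):
    def __init__(self):
        super().__init__()
        # Branch for the particle image pair (two frames stacked on channels).
        self.image_branch = nn.Sequential(
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        # Branch for the coarse (u, v) field, upsampled to the image resolution.
        self.velocity_branch = nn.Sequential(
            nn.Upsample(scale_factor=8, mode='bilinear', align_corners=False),
            nn.Conv2d(2, 32, 3, padding=1), nn.ReLU())
        # Fused head predicting a dense per-pixel (u, v) displacement field.
        self.head = nn.Sequential(
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 2, 1))

    def forward(self, images, coarse_velocity):
        feats = torch.cat([self.image_branch(images),
                           self.velocity_branch(coarse_velocity)], dim=1)
        return self.head(feats)

net = DualInputFCN()
dense_uv = net(torch.randn(1, 2, 256, 256), torch.randn(1, 2, 32, 32))  # (1, 2, 256, 256)
```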

A Scalable MIP-based Method for Learning Optimal Multivariate Decision Trees

no code implementations · NeurIPS 2020 · Haoran Zhu, Pavankumar Murali, Dzung T. Phan, Lam M. Nguyen, Jayant R. Kalagnanam

Several recent publications report advances in training optimal decision trees (ODTs) using mixed-integer programs (MIPs), driven by algorithmic advances in integer programming and growing interest in addressing the inherent suboptimality of heuristic approaches such as CART.
