1 code implementation • 17 Aug 2023 • Mirazul Haque, Wei Yang
Then, through research studies, we provide insight into the design choices that can increase the robustness of DyNNs against attacks generated using static models.
1 code implementation • 23 Jul 2023 • Jiangrui Zheng, Xueqing Liu, Guanqun Yang, Mirazul Haque, Xing Qian, Ravishka Rathnasuriya, Wei Yang, Girish Budhrani
We observe significant improvement in the models' conformity to content policies while having comparable scores on the original test data.
1 code implementation • 1 Jun 2023 • Mirazul Haque, Rutvij Shah, Simin Chen, Berrak Şişman, Cong Liu, Wei Yang
We show that popular ASR models such as Speech2Text and Whisper perform input-dependent dynamic computation, resulting in dynamic efficiency.
Automatic Speech Recognition (ASR)
no code implementations • CVPR 2023 • Simin Chen, Hanlin Chen, Mirazul Haque, Cong Liu, Wei Yang
Recent advancements in deploying deep neural networks (DNNs) on resource-constrained devices have generated interest in input-adaptive dynamic neural networks (DyNNs).
1 code implementation • COLING 2022 • Guanqun Yang, Mirazul Haque, Qiaochu Song, Wei Yang, Xueqing Liu
Our experiments show that TestAug has three advantages over existing work on behavioral testing: (1) TestAug can find more bugs than existing work; (2) the test cases in TestAug are more diverse; and (3) TestAug largely reduces the manual effort of creating test suites.
no code implementations • 10 Oct 2022 • Simin Chen, Mirazul Haque, Cong Liu, Wei Yang
To ensure an AdNN satisfies the performance requirements of resource-constrained applications, it is essential to conduct performance testing to detect IDPBs in the AdNN.
no code implementations • 7 Oct 2022 • Simin Chen, Cong Liu, Mirazul Haque, Zihe Song, Wei Yang
Neural Machine Translation (NMT) systems have received much recent attention due to their human-level accuracy.
no code implementations • 19 Apr 2022 • Mirazul Haque, Christof J. Budnik, Wei Yang
These DNNs are vulnerable to adversarial perturbations and corruptions.
1 code implementation • CVPR 2022 • Simin Chen, Zihe Song, Mirazul Haque, Cong Liu, Wei Yang
To further understand such efficiency-oriented threats, we propose a new attack approach, NICGSlowDown, to evaluate the efficiency robustness of NICG models.
no code implementations • 12 Feb 2022 • Mirazul Haque, Yaswanth Yadlapalli, Wei Yang, Cong Liu
The test inputs generated by EREBA can increase the energy consumption of AdNNs by 2,000% compared to the original inputs.
no code implementations • 29 Sep 2021 • Mirazul Haque, Simin Chen, Wasif Arman Haque, Cong Liu, Wei Yang
Unlike the memory cost, the energy consumption of the Neural ODEs during inference can be adaptive because of the adaptive nature of the ODE solvers.
no code implementations • 29 Sep 2021 • Simin Chen, Mirazul Haque, Zihe Song, Cong Liu, Wei Yang
To further the understanding of such efficiency-oriented threats and raise the community's concern about the efficiency robustness of NMT systems, we propose a new attack approach, TranSlowDown, to test the efficiency robustness of NMT systems.
no code implementations • CVPR 2020 • Mirazul Haque, Anki Chauhan, Cong Liu, Wei Yang
With the increasing number of layers and parameters in neural networks, the energy consumption of neural networks has become a great concern to society, especially to users of handheld or embedded devices.