no code implementations • 11 Apr 2024 • Hongrui Chen, Xingchen Liu, Levent Burak Kara
The neural network takes as input both the local coordinates within a cell, to represent the density distribution inside it, and the global coordinates of each cell, to design spatially varying microstructure cells.
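The input scheme described above can be sketched as a small coordinate network; this is a minimal illustration in pure Python, not the paper's implementation, and all names (`init_mlp`, `density`, layer sizes) are ours:

```python
import math
import random

# Hypothetical sketch: an MLP that maps concatenated (local, global) cell
# coordinates to a density value, so one network can represent spatially
# varying microstructure cells.

def init_mlp(sizes, seed=0):
    """Random weights; each row is [w_1 .. w_m, bias]."""
    rng = random.Random(seed)
    return [
        [[rng.uniform(-0.5, 0.5) for _ in range(m)] + [0.0]
         for _ in range(n)]
        for m, n in zip(sizes[:-1], sizes[1:])
    ]

def forward(layers, x):
    for i, layer in enumerate(layers):
        x = [sum(w * xi for w, xi in zip(row[:-1], x)) + row[-1]
             for row in layer]
        if i < len(layers) - 1:
            x = [math.tanh(v) for v in x]      # hidden nonlinearity
    return 1.0 / (1.0 + math.exp(-x[0]))       # sigmoid keeps density in (0, 1)

def density(layers, local_xy, global_xy):
    # local coords pick a point inside the cell; global coords select the cell
    return forward(layers, list(local_xy) + list(global_xy))
```

For example, `density(init_mlp([4, 8, 1]), (0.2, 0.7), (3.0, 5.0))` queries the density at local point (0.2, 0.7) of the cell located at (3.0, 5.0).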
no code implementations • 12 Feb 2024 • Hongrui Chen, Lexing Ying
Diffusion models have achieved huge empirical success in data generation tasks.
no code implementations • 26 Jan 2024 • Baoyuan Wu, Hongrui Chen, Mingda Zhang, Zihao Zhu, Shaokui Wei, Danni Yuan, Mingli Zhu, Ruotong Wang, Li Liu, Chao Shen
We hope that our efforts can build a solid foundation for backdoor learning, helping researchers investigate existing algorithms, develop more innovative ones, and explore the intrinsic mechanisms of backdoor learning.
no code implementations • 13 Dec 2023 • Baoyuan Wu, Shaokui Wei, Mingli Zhu, Meixi Zheng, Zihao Zhu, Mingda Zhang, Hongrui Chen, Danni Yuan, Li Liu, Qingshan Liu
The adversarial phenomenon has been widely observed in machine learning (ML) systems, especially those using deep neural networks: in certain cases, ML systems may produce predictions that are inconsistent with, and incomprehensible to, humans.
no code implementations • 5 Jun 2023 • Hongrui Chen, Jihao Long, Lei Wu
We prove that if $\beta$ is independent of the input dimension $d$, then functions in the RKHS can be learned efficiently under the $L^\infty$ norm, i.e., the sample complexity depends polynomially on $d$.
no code implementations • 1 Jun 2023 • Ruotong Wang, Hongrui Chen, Zihao Zhu, Li Liu, Baoyuan Wu
Deep neural networks (DNNs) can be manipulated to exhibit specific behaviors when exposed to specific trigger patterns without affecting their performance on benign samples, a phenomenon dubbed a backdoor attack.
1 code implementation • 17 May 2023 • Hongrui Chen, Aditya Joglekar, Levent Burak Kara
We employ the strain energy field calculated on the initial design domain as an additional conditioning field input to the neural network throughout the optimization.
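The conditioning-field idea above can be illustrated with a small sketch: sample a precomputed strain-energy field (stored here on a regular grid) at each query point and append the value to the coordinate input. This is our hedged illustration, not the paper's code; `bilinear_sample` and `conditioned_input` are hypothetical names:

```python
# Hedged sketch: bilinearly sample a strain-energy field stored on a
# unit-spaced regular grid, then append the sampled value to the
# coordinate input of the density network.

def bilinear_sample(grid, x, y):
    """grid: 2-D list of field values; (x, y) within the grid's extent."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid) - 1)
    y1 = min(y0 + 1, len(grid[0]) - 1)
    tx, ty = x - x0, y - y0
    top = grid[x0][y0] * (1 - ty) + grid[x0][y1] * ty
    bot = grid[x1][y0] * (1 - ty) + grid[x1][y1] * ty
    return top * (1 - tx) + bot * tx

def conditioned_input(xy, strain_grid):
    # The field is computed once on the initial design domain and then
    # sampled as an extra input channel throughout the optimization.
    x, y = xy
    return [x, y, bilinear_sample(strain_grid, x, y)]
```

The field is evaluated once up front, so conditioning adds only an interpolation lookup per query point rather than a finite-element solve.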
no code implementations • 9 May 2023 • Hongrui Chen, Jihao Long, Lei Wu
The first application is to study learning functions in $\mathcal{F}_{p,\pi}$ with RFMs.
1 code implementation • 6 May 2023 • Aditya Joglekar, Hongrui Chen, Levent Burak Kara
We show that, with a suitable Fourier Features neural network architecture and hyperparameters, the density field approximation network can learn weights that represent the optimal density field for the given domain and boundary conditions by directly backpropagating the loss gradient through the displacement field approximation network. Unlike prior work, this requires no sensitivity filter, no optimality criterion method, and no separate training of the density network in each topology optimization iteration.
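The Fourier Features input mapping mentioned above can be sketched as follows; the frequency scale and feature count here are illustrative assumptions, not the paper's hyperparameters:

```python
import math
import random

# Hedged sketch: random Fourier feature mapping commonly used to let
# coordinate networks represent high-frequency density fields.

def make_frequencies(num_features, dim, scale=10.0, seed=0):
    """Frequency matrix B, sampled once from a scaled Gaussian."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) * scale for _ in range(dim)]
            for _ in range(num_features)]

def fourier_features(xy, B):
    """Map coordinates to [sin(2*pi*Bx), cos(2*pi*Bx)] pairs."""
    return [f(2.0 * math.pi * sum(b * v for b, v in zip(row, xy)))
            for row in B
            for f in (math.sin, math.cos)]
```

The mapped features, rather than raw coordinates, are fed to the density network; the scale of `B` controls the highest spatial frequency the network can easily fit.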
no code implementations • 3 Nov 2022 • Hongrui Chen, Holden Lee, Jianfeng Lu
We give an improved theoretical analysis of score-based generative modeling.
no code implementations • 4 Oct 2022 • Hongrui Chen, Aditya Joglekar, Kate S. Whitefoot, Levent Burak Kara
Through training, the network learns a material density and segment classification in the continuous 3D space.
1 code implementation • 25 Jun 2022 • Baoyuan Wu, Hongrui Chen, Mingda Zhang, Zihao Zhu, Shaokui Wei, Danni Yuan, Chao Shen
However, we find that the evaluations of new methods are often not thorough enough to verify their claims and actual performance, mainly due to rapid development, diverse settings, and the difficulty of implementation and reproducibility.