Search Results for author: Jie Bu

Found 8 papers, 7 papers with code

Beyond Discriminative Regions: Saliency Maps as Alternatives to CAMs for Weakly Supervised Semantic Segmentation

no code implementations • 21 Aug 2023 • M. Maruf, Arka Daw, Amartya Dutta, Jie Bu, Anuj Karpatne

Furthermore, we propose random cropping as a stochastic aggregation technique that improves the performance of saliency maps, making them a strong alternative to CAMs for WS3 (a rough sketch of this aggregation follows below).

Segmentation · Weakly Supervised Semantic Segmentation · +1
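
The random-crop aggregation can be pictured as averaging gradient-based saliency maps computed on many random crops of the same image and pasting them back into a full-size map. The sketch below is a minimal illustration of that idea, not the authors' pipeline: it assumes a classifier `model` taking 224x224 inputs, uses vanilla input-gradient saliency as a stand-in for whatever saliency method is used, and the crop size and crop count are arbitrary choices.

```python
import torch
import torch.nn.functional as F

def grad_saliency(model, x, target):
    # vanilla gradient saliency: |d score_target / d input|, max over channels
    x = x.detach().clone().requires_grad_(True)
    model(x)[0, target].backward()
    return x.grad.abs().max(dim=1)[0]                          # shape (1, H, W)

def random_crop_saliency(model, image, target, n_crops=16, crop=192):
    # stochastic aggregation: average saliency maps over random crops
    # image is assumed to be (1, 3, H, W) with H, W >= crop
    _, _, H, W = image.shape
    acc = torch.zeros(1, H, W)
    cnt = torch.zeros(1, H, W)
    for _ in range(n_crops):
        top = torch.randint(0, H - crop + 1, (1,)).item()
        left = torch.randint(0, W - crop + 1, (1,)).item()
        patch = image[:, :, top:top + crop, left:left + crop]
        patch = F.interpolate(patch, size=(224, 224), mode="bilinear",
                              align_corners=False)
        sal = grad_saliency(model, patch, target)               # (1, 224, 224)
        sal = F.interpolate(sal.unsqueeze(1), size=(crop, crop), mode="bilinear",
                            align_corners=False).squeeze(1)
        acc[:, top:top + crop, left:left + crop] += sal
        cnt[:, top:top + crop, left:left + crop] += 1
    return acc / cnt.clamp(min=1)                               # averaged full-size map
```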

Let There Be Order: Rethinking Ordering in Autoregressive Graph Generation

1 code implementation • 24 May 2023 • Jie Bu, Kazi Sajeed Mehrab, Anuj Karpatne

Conditional graph generation tasks involve training a model to generate a graph given a set of input conditions.

Dimensionality Reduction · Graph Generation

Mitigating Propagation Failures in Physics-informed Neural Networks using Retain-Resample-Release (R3) Sampling

1 code implementation • 5 Jul 2022 • Arka Daw, Jie Bu, Sifan Wang, Paris Perdikaris, Anuj Karpatne

In this paper, we provide a novel perspective on the failure modes of PINNs by hypothesizing that training PINNs relies on the successful "propagation" of the solution from initial and/or boundary condition points to interior points.
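
The R3 (Retain-Resample-Release) scheme of the title follows from this propagation view: collocation points with high PDE residuals are retained so the optimizer keeps working on them, while low-residual points are released and replaced by fresh uniform samples. Below is a rough sketch of one such update under my reading of the idea; the retention rule (mean residual as threshold), `residual_fn`, and the box-domain bounds are assumptions rather than the paper's exact algorithm.

```python
import torch

def r3_update(points, residual_fn, domain_lo, domain_hi):
    """One Retain-Resample-Release step over a box domain (sketch).

    points      -- (N, d) current collocation points
    residual_fn -- callable mapping (N, d) points to (N,) PDE residuals
    """
    res = residual_fn(points).abs().detach()       # residual magnitude per point
    keep = res > res.mean()                        # retain high-residual points
    retained = points[keep].detach()
    n_new = points.shape[0] - retained.shape[0]    # release the rest ...
    lo = torch.as_tensor(domain_lo, dtype=points.dtype)
    hi = torch.as_tensor(domain_hi, dtype=points.dtype)
    fresh = lo + (hi - lo) * torch.rand(n_new, points.shape[1])   # ... and resample
    return torch.cat([retained, fresh], dim=0)     # population size stays fixed
```

In a PINN training loop, a step like this would run once per iteration, after the residual loss on the current population has been used for a gradient update.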

Quadratic Residual Networks: A New Class of Neural Networks for Solving Forward and Inverse Problems in Physics Involving PDEs

1 code implementation • 20 Jan 2021 • Jie Bu, Anuj Karpatne

We propose quadratic residual networks (QRes) as a new type of parameter-efficient neural network architecture that adds a quadratic residual term to the weighted sum of inputs before applying the activation function (see the sketch below).

Efficient Neural Network
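
Read literally, that construction gives a layer of the form sigma(W1x * W2x + W1x + b): a second weight matrix whose output multiplies the usual affine term element-wise before the nonlinearity. The PyTorch sketch below follows that reading; the exact placement of the bias inside the quadratic term may differ from the paper.

```python
import torch
import torch.nn as nn

class QResLayer(nn.Module):
    """Quadratic-residual layer (sketch): sigma(h * W2x + h) with h = W1x + b."""

    def __init__(self, in_features, out_features, activation=torch.tanh):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)              # W1, b
        self.quad = nn.Linear(in_features, out_features, bias=False)    # W2
        self.activation = activation

    def forward(self, x):
        h = self.linear(x)                            # ordinary weighted sum of inputs
        return self.activation(h * self.quad(x) + h)  # plus the quadratic residual term

# e.g. a small PINN-style network mapping (x, t) -> u
net = nn.Sequential(QResLayer(2, 64), QResLayer(64, 64), nn.Linear(64, 1))
```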

Beyond Observed Connections: Link Injection

1 code implementation • 2 Sep 2020 • Jie Bu, M. Maruf, Arka Daw

In this paper, we propose link injection, a novel method that enables any differentiable graph machine learning model to go beyond the observed connections in the input data in an end-to-end learning fashion (a sketch follows below).

Link Prediction · Node Classification
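
One way to picture this is a learnable score for every node pair, squashed into (0, 1) and added on top of the observed adjacency, so that gradients from any downstream graph model can create new soft links. The dense N x N parameterisation and the minimal GCN-style layer below are illustrative choices of mine, not necessarily the paper's formulation.

```python
import torch
import torch.nn as nn

class LinkInjection(nn.Module):
    """Adds learnable soft links on top of an observed adjacency matrix (sketch)."""

    def __init__(self, num_nodes):
        super().__init__()
        # one learnable score per node pair; start strongly negative so that
        # sigmoid(score) ~ 0 and almost nothing is injected at initialisation
        self.scores = nn.Parameter(torch.full((num_nodes, num_nodes), -4.0))

    def forward(self, adj):
        injected = torch.sigmoid(self.scores)
        injected = 0.5 * (injected + injected.T)   # keep the graph undirected
        return adj + injected                      # observed + injected connections

class DenseGCNLayer(nn.Module):
    """Minimal dense GCN-style layer that consumes the injected adjacency."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return torch.relu(self.lin((adj / deg) @ x))   # row-normalised propagation
```

Because the injected entries come from an `nn.Parameter` passed through a sigmoid, they receive gradients from whatever node-classification or link-prediction loss sits on top, which is the end-to-end aspect.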

CoPhy-PGNN: Learning Physics-guided Neural Networks with Competing Loss Functions for Solving Eigenvalue Problems

1 code implementation • 2 Jul 2020 • Mohannad Elhamod, Jie Bu, Christopher Singh, Matthew Redell, Abantika Ghosh, Viktor Podolskiy, Wei-Cheng Lee, Anuj Karpatne

Physics-guided Neural Networks (PGNNs) are an emerging class of neural networks trained with physics-guided (PG) loss functions, which penalize violations of known physics in the network outputs, alongside the supervision contained in the data.
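
The basic recipe behind such PG loss functions is a supervised data loss plus a weighted penalty on how badly the network outputs violate the known physics; CoPhy-PGNN's contribution concerns how these competing terms are balanced during training, which the sketch below does not attempt to reproduce. The eigenvalue-style residual is just one example of what the physics term could look like, and `lam` is a placeholder trade-off weight.

```python
import torch

def eigen_residual(H, psi, eig):
    # example physics residual for an eigenvalue problem: H psi - eig * psi
    # (zero exactly when (eig, psi) is a true eigenpair of H)
    return H @ psi - eig * psi

def pgnn_loss(y_pred, y_true, physics_residual, lam=1.0):
    """Physics-guided loss (sketch): data loss + lam * physics-violation penalty."""
    data_loss = torch.mean((y_pred - y_true) ** 2)
    phys_loss = torch.mean(physics_residual ** 2)
    return data_loss + lam * phys_loss
```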

Physics-guided Design and Learning of Neural Networks for Predicting Drag Force on Particle Suspensions in Moving Fluids

1 code implementation • 6 Nov 2019 • Nikhil Muralidhar, Jie Bu, Ze Cao, Long He, Naren Ramakrishnan, Danesh Tafti, Anuj Karpatne

In such situations, it is often useful to rely on machine learning methods to fill in the gap by learning a model of the complex physical process directly from simulation data.
