Search Results for author: Prakhar Ganesh

Found 8 papers, 2 papers with code

An Empirical Investigation into Benchmarking Model Multiplicity for Trustworthy Machine Learning: A Case Study on Image Classification

No code implementations · 24 Nov 2023 · Prakhar Ganesh

Our work stands out by offering a one-stop empirical benchmark of multiplicity across various dimensions of model design and its impact on a diverse set of trustworthy metrics.

Benchmarking · Image Classification · +1
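
A minimal sketch of the idea behind the multiplicity benchmark above, not the paper's actual code: train several models that reach comparable accuracy but differ in design choices, then measure how often their predictions disagree. The synthetic dataset, the specific models, and the disagreement metric are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Toy dataset standing in for an image-classification benchmark.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Different "dimensions of model design": learning algorithm and random seed.
models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=100, random_state=1),
    RandomForestClassifier(n_estimators=100, random_state=2),
]
preds = np.stack([m.fit(X_tr, y_tr).predict(X_te) for m in models])

accuracies = (preds == y_te).mean(axis=1)
# Share of test points where at least one model disagrees with the first.
disagreement = (preds != preds[0]).any(axis=0).mean()
print("accuracies:", np.round(accuracies, 3), "| disagreement rate:", round(disagreement, 3))
```

Even when the accuracies are close, the disagreement rate can be noticeably above zero, which is the multiplicity effect the benchmark quantifies against trustworthiness metrics.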

On The Impact of Machine Learning Randomness on Group Fairness

No code implementations · 9 Jul 2023 · Prakhar Ganesh, Hongyan Chang, Martin Strobel, Reza Shokri

We investigate the impact on group fairness of different sources of randomness in training neural networks.

Fairness
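
The sketch below illustrates the kind of experiment described in the randomness-and-fairness paper above; it is not the paper's code. The synthetic data, the small MLP, and the demographic-parity gap are illustrative assumptions: everything is held fixed except the training seed, and a group-fairness metric is measured across runs.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 4000
group = rng.integers(0, 2, n)                       # hypothetical sensitive attribute
X = rng.normal(size=(n, 10)) + group[:, None] * 0.3
y = (X[:, 0] + 0.5 * group + rng.normal(size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.5, random_state=0)

def demographic_parity_gap(pred, g):
    """Absolute difference in positive-prediction rates between the two groups."""
    return abs(pred[g == 0].mean() - pred[g == 1].mean())

# Only the training seed changes between runs; data, model, and split stay fixed.
gaps = []
for seed in range(10):
    clf = MLPClassifier(hidden_layer_sizes=(16,), random_state=seed, max_iter=300)
    clf.fit(X_tr, y_tr)
    gaps.append(demographic_parity_gap(clf.predict(X_te), g_te))

print(f"fairness gap across seeds: min={min(gaps):.3f} max={max(gaps):.3f}")
```

The spread between the minimum and maximum gap gives a rough sense of how much group fairness can vary from training randomness alone.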

HiKonv: High Throughput Quantized Convolution With Novel Bit-wise Management and Computation

No code implementations · 28 Dec 2021 · Xinheng Liu, Yao Chen, Prakhar Ganesh, Junhao Pan, JinJun Xiong, Deming Chen

Quantization for Convolutional Neural Networks (CNNs) has shown significant progress in reducing the cost of computation and storage by using low-bitwidth data inputs.

Management · Quantization
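
For context on the low-bitwidth setting HiKonv targets, here is a minimal sketch of uniform symmetric quantization of a convolution kernel. It is not the paper's bit-wise packing scheme; the function names and the symmetric quantizer are illustrative assumptions.

```python
import numpy as np

def quantize_symmetric(w, bits):
    """Map float weights to signed integers with `bits` bits, plus a scale factor."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 7 for 4-bit signed values
    scale = np.abs(w).max() / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return q.astype(np.float32) * scale

w = np.random.randn(3, 3).astype(np.float32)        # a toy 3x3 convolution kernel
q, scale = quantize_symmetric(w, bits=4)
print("max abs reconstruction error:", np.abs(w - dequantize(q, scale)).max())
```

Once weights and activations live in a few bits, multiple values can share one wide multiplier, which is the throughput opportunity the paper's bit-wise management exploits.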

YOLO-ReT: Towards High Accuracy Real-time Object Detection on Edge GPUs

1 code implementation · 26 Oct 2021 · Prakhar Ganesh, Yao Chen, Yin Yang, Deming Chen, Marianne Winslett

Performance of object detection models has been growing rapidly on two major fronts: model accuracy and efficiency.

object-detection · Real-Time Object Detection · +1

Free Lunch for Co-Saliency Detection: Context Adjustment

No code implementations · 4 Aug 2021 · Lingdong Kong, Prakhar Ganesh, Tan Wang, Junhao Liu, Le Zhang, Yao Chen

We hope that the scale, diversity, and quality of our dataset can benefit researchers in this area and beyond.

counterfactual · Saliency Detection · +1

Compressing Large-Scale Transformer-Based Models: A Case Study on BERT

No code implementations · 27 Feb 2020 · Prakhar Ganesh, Yao Chen, Xin Lou, Mohammad Ali Khan, Yin Yang, Hassan Sajjad, Preslav Nakov, Deming Chen, Marianne Winslett

Pre-trained Transformer-based models have achieved state-of-the-art performance for various Natural Language Processing (NLP) tasks.

Model Compression

VLSTM: Very Long Short-Term Memory Networks for High-Frequency Trading

No code implementations · 5 Sep 2018 · Prakhar Ganesh, Puneet Rakheja

One of the most sought-after forms of electronic trading is high-frequency trading (HFT), typically characterized by microsecond-sensitive changes that produce a tremendous amount of data.

Time Series · Time Series Forecasting · +1
