no code implementations • 19 Feb 2024 • Avinandan Bose, Simon Shaolei Du, Maryam Fazel
We study the problem of representation transfer in offline Reinforcement Learning (RL), where a learner has access to episodic data from a number of source tasks collected a priori, and aims to learn a shared representation to be used in finding a good policy for a target task.
no code implementations • 19 Dec 2023 • Avinandan Bose, Mihaela Curmei, Daniel L. Jiang, Jamie Morgenstern, Sarah Dean, Lillian J. Ratliff, Maryam Fazel
(ii) Suboptimal Local Solutions: The total loss (sum of loss functions across all users and all services) landscape is not convex even if the individual losses on a single service are convex, making it likely for the learning dynamics to get stuck in local minima.
no code implementations • 31 May 2022 • Avinandan Bose, Arunesh Sinha, Tien Mai
Distributionally robust optimization (DRO) has shown a lot of promise in providing robustness in learning as well as in sample-based optimization problems.
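To make the DRO setting concrete, here is a minimal sketch of the general idea, not the paper's method: minimize the worst-case expected loss over a small finite ambiguity set of sample reweightings, via subgradient descent. The toy quadratic loss, the ambiguity set, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x_data = rng.normal(loc=2.0, scale=1.0, size=50)  # toy samples

def losses(theta):
    return (x_data - theta) ** 2  # per-sample squared loss

# Finite ambiguity set: uniform weights plus two skewed reweightings.
n = len(x_data)
uniform = np.full(n, 1.0 / n)
skew_lo = np.where(x_data < np.median(x_data), 2.0, 0.5)
skew_hi = np.where(x_data >= np.median(x_data), 2.0, 0.5)
weights = [uniform, skew_lo / skew_lo.sum(), skew_hi / skew_hi.sum()]

theta = 0.0
for _ in range(200):
    per_sample = losses(theta)
    worst = max(weights, key=lambda w: w @ per_sample)  # worst-case distribution
    grad = worst @ (2.0 * (theta - x_data))             # subgradient at the worst case
    theta -= 0.01 * grad
```

The robust estimate ends up near the point where the worst-case reweighted losses balance, close to the data mean here; a plain empirical-risk minimizer would instead fit the uniform weighting only.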
1 code implementation • 24 Jan 2022 • Changyu Chen, Avinandan Bose, Shih-Fen Cheng, Arunesh Sinha
Recent work has used generative models (GANs in particular) for providing high-fidelity simulation of real-world systems.
no code implementations • 1 Dec 2021 • Avinandan Bose, Pradeep Varakantham
Owing to the benefits for customers (lower prices), drivers (higher revenues), aggregation companies (higher revenues) and the environment (fewer vehicles), on-demand ride pooling (e.g., Uber Pool, Grab Share) has become quite popular.
no code implementations • 29 Nov 2021 • Avinandan Bose, Soumendu Sundar Mukherjee
Changepoint analysis deals with the unsupervised detection and/or estimation of time-points in time-series data at which the distribution generating the data changes.
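As a hedged illustration of the single-changepoint estimation task (a textbook CUSUM-style baseline, not the paper's method), one can locate a mean shift by maximizing the scaled gap between the means of the two segments on either side of each candidate time-point. The synthetic series and all names below are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
series = np.concatenate([rng.normal(0.0, 1.0, 100),
                         rng.normal(3.0, 1.0, 100)])  # true changepoint at t = 100

n = len(series)
best_t, best_stat = None, -np.inf
for t in range(10, n - 10):  # skip tiny segments at the boundaries
    left, right = series[:t], series[t:]
    # CUSUM-type statistic: mean gap scaled by segment sizes
    stat = abs(left.mean() - right.mean()) * np.sqrt(t * (n - t) / n)
    if stat > best_stat:
        best_t, best_stat = t, stat
```

With a 3-standard-deviation shift, the maximizer `best_t` lands close to the true changepoint at 100.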
no code implementations • 7 Nov 2021 • Avinandan Bose, Aniket Das, Yatin Dandi, Piyush Rai
In this work, we propose a novel generative model that learns a flexible non-parametric prior over interpolation trajectories, conditioned on a pair of source and target images.
no code implementations • NeurIPS Workshop DLDE 2021 • Avinandan Bose, Aniket Das, Yatin Dandi, Piyush Rai
A range of applications require learning image generation models whose latent space effectively captures the high-level factors of variation in the data distribution, a property that can be judged by the model's ability to interpolate smoothly between images.
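For context, the fixed interpolation paths that a learned prior over trajectories would replace are typically linear or spherical (slerp) paths between two latent codes. The sketch below shows those standard baselines; it is illustrative only, and the function names and latent dimension are assumptions.

```python
import numpy as np

def lerp(z0, z1, t):
    """Straight-line interpolation between latent codes."""
    return (1 - t) * z0 + t * z1

def slerp(z0, z1, t):
    """Spherical interpolation: follows the great-circle arc between the
    codes, which preserves norm better in high-dimensional Gaussian
    latent spaces than a straight line does."""
    cos_omega = np.dot(z0, z1) / (np.linalg.norm(z0) * np.linalg.norm(z1))
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return lerp(z0, z1, t)  # nearly parallel codes: fall back to lerp
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(2)
z0, z1 = rng.normal(size=64), rng.normal(size=64)
path = [slerp(z0, z1, t) for t in np.linspace(0, 1, 8)]  # 8-step trajectory
```

Decoding each point of `path` yields the image interpolation; the path's endpoints are exactly the source and target codes.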