Search Results for author: Sahil Garg

Found 21 papers, 4 papers with code

$\textbf{S}^2$IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting

no code implementations · 9 Mar 2024 · Zijie Pan, Yushan Jiang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song

To this end, we propose Semantic Space Informed Prompt learning with LLM ($S^2$IP-LLM) to align the pre-trained semantic space with time series embeddings space and perform time series forecasting based on learned prompts from the joint space.

Time Series · Time Series Forecasting

Structural Knowledge Informed Continual Multivariate Time Series Forecasting

no code implementations · 20 Feb 2024 · Zijie Pan, Yushan Jiang, Dongjin Song, Sahil Garg, Kashif Rasul, Anderson Schneider, Yuriy Nevmyvaka

To address this issue, we propose a novel Structural Knowledge Informed Continual Learning (SKI-CL) framework to perform MTS forecasting within a continual learning paradigm, which leverages structural knowledge to steer the forecasting model toward identifying and adapting to different regimes, and selects representative MTS samples from each regime for memory replay.

Continual Learning · Graph structure learning · +2
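The abstract describes the memory-replay component only at a high level. As an illustrative sketch (not SKI-CL's actual selection criterion, which uses structural knowledge to pick representative samples per regime), a bounded replay memory can be maintained with reservoir sampling:

```python
import random

class ReplayBuffer:
    """Bounded memory of past samples for replay during continual
    training; reservoir sampling keeps each seen sample with equal
    probability. A generic stand-in for regime-aware selection."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0  # total samples observed so far

    def add(self, sample):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(sample)
        else:
            # Replace a stored sample with probability capacity/seen.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = sample

    def sample(self, n):
        """Draw a replay minibatch."""
        return random.sample(self.buffer, min(n, len(self.buffer)))
```

One such buffer per detected regime would approximate the paper's "representative MTS samples from each regime" without knowing its exact criterion.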

Empowering Time Series Analysis with Large Language Models: A Survey

no code implementations · 5 Feb 2024 · Yushan Jiang, Zijie Pan, Xikun Zhang, Sahil Garg, Anderson Schneider, Yuriy Nevmyvaka, Dongjin Song

Specifically, we first state the challenges and motivations of applying language models in the context of time series as well as brief preliminaries of LLMs.

Time Series · Time Series Analysis

Structural block driven - enhanced convolutional neural representation for relation extraction

no code implementations · 21 Mar 2021 · Dongsheng Wang, Prayag Tiwari, Sahil Garg, Hongyin Zhu, Peter Bruza

In this paper, we propose a novel lightweight relation extraction approach based on structural-block-driven convolutional neural learning.

Relation · Relation Extraction · +1

The Shift to 6G Communications: Vision and Requirements

no code implementations · 15 Oct 2020 · Muhammad Waseem Akhtar, Syed Ali Hassan, Rizwan Ghaffar, Haejoon Jung, Sahil Garg, M. Shamim Hossain

The sixth-generation (6G) wireless communication network is expected to integrate terrestrial, aerial, and maritime communications into a robust network that is more reliable, faster, and able to support a massive number of devices with ultra-low latency requirements.

BIG-bench Machine Learning · Edge-computing · +1

Deep Anomaly Detection for Time-series Data in Industrial IoT: A Communication-Efficient On-device Federated Learning Approach

no code implementations · 19 Jul 2020 · Yi Liu, Sahil Garg, Jiangtian Nie, Yang Zhang, Zehui Xiong, Jiawen Kang, M. Shamim Hossain

Third, to adapt the proposed framework to the timeliness of industrial anomaly detection, we propose a gradient compression mechanism based on Top-$k$ selection to improve communication efficiency.

Anomaly Detection · Federated Learning · +2
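The Top-k gradient compression step above can be sketched minimally as follows; this is standard magnitude-based Top-k selection, and details of the paper's actual mechanism (e.g. error feedback or index encoding) may differ:

```python
import numpy as np

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries of a gradient
    vector and zero out the rest; only the k surviving (index,
    value) pairs would need to be transmitted to the server."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

grad = np.array([0.1, -2.0, 0.05, 3.0, -0.4])
compressed = topk_compress(grad, 2)  # keeps only -2.0 and 3.0
```

With k much smaller than the gradient dimension, each device uploads O(k) values instead of the full dense gradient.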

An RNN-Survival Model to Decide Email Send Times

no code implementations · 21 Apr 2020 · Harvineet Singh, Moumita Sinha, Atanu R. Sinha, Sahil Garg, Neha Banerjee

We posit that emails are likely to be opened sooner when send times are convenient for recipients, while for other send times, emails can get ignored.

Survival Analysis

Nearly-Unsupervised Hashcode Representations for Biomedical Relation Extraction

no code implementations · IJCNLP 2019 · Sahil Garg, Aram Galstyan, Greg Ver Steeg, Guillermo Cecchi

Recently, kernelized locality sensitive hashcodes have been successfully employed as representations of natural language text, especially showing high relevance to biomedical relation extraction tasks.

Relation · Relation Extraction

Nearly-Unsupervised Hashcode Representations for Relation Extraction

no code implementations · 9 Sep 2019 · Sahil Garg, Aram Galstyan, Greg Ver Steeg, Guillermo Cecchi

Recently, kernelized locality sensitive hashcodes have been successfully employed as representations of natural language text, especially showing high relevance to biomedical relation extraction tasks.

Relation · Relation Extraction

Securing Fog-to-Things Environment Using Intrusion Detection System Based On Ensemble Learning

no code implementations · 30 Jan 2019 · Poulmanogo Illy, Georges Kaddoum, Christian Miranda Moreira, Kuljeet Kaur, Sahil Garg

Many solutions proposed in the literature are reported to have high accuracy but are ineffective in real applications because the datasets used to train and evaluate the underlying models are not representative.

Anomaly Detection · Ensemble Learning · +1

Building Models for Biopathway Dynamics Using Intrinsic Dimensionality Analysis

no code implementations · 29 Apr 2018 · Emilia M. Wysocka, Valery Dzutsati, Tirthankar Bandyopadhyay, Laura Condon, Sahil Garg

In the paper, we elaborate on the rationale for multidimensional analysis in the context of molecular signaling, and then introduce the chosen model, the simulation details, and the resulting time-series dynamics.

Dimensionality Reduction · Time Series · +1

Learning Non-Stationary Space-Time Models for Environmental Monitoring

no code implementations · 27 Apr 2018 · Sahil Garg, Amarjeet Singh, Fabio Ramos

One of the primary aspects of sustainable development involves accurate understanding and modeling of environmental phenomena.

Persistent Monitoring of Stochastic Spatio-temporal Phenomena with a Small Team of Robots

no code implementations · 27 Apr 2018 · Sahil Garg, Nora Ayanian

We propose an adaptive solution for the problem where stochastic real-world dynamics are modeled as a Gaussian Process (GP).
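As a minimal illustration of modeling dynamics with a Gaussian Process (generic GP regression with an RBF kernel, not the paper's adaptive multi-robot solution), the posterior mean at unobserved locations is:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential (RBF) kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of a zero-mean GP: k(x*, X) (K + sigma^2 I)^-1 y."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf(x_test, x_train) @ np.linalg.solve(K, y_train)

x = np.array([0.0, 1.0, 2.0])            # observation locations
y = np.array([0.0, 1.0, 4.0])            # observed phenomenon values
mean = gp_predict(x, y, np.array([1.5]))  # interpolate between samples
```

In a monitoring setting, robots would be routed toward locations where the GP's predictive uncertainty (not shown here) is highest.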

Efficiently Learning Nonstationary Gaussian Processes for Real World Impact

no code implementations · 27 Apr 2018 · Sahil Garg

Therefore, for efficient yet accurate inference, we propose to build an induced latent-dynamics representation using a novel algorithm, LISAL, that adaptively maximizes entropy or mutual information on the induced latent dynamics and the marginal likelihood of the observed real dynamics in an iterative manner.

Gaussian Processes · Informativeness

Adaptive Sensing for Learning Nonstationary Environment Models

no code implementations · 26 Apr 2018 · Sahil Garg, Amarjeet Singh, Fabio Ramos

The core idea in LISAL is to learn two models using Gaussian processes (GPs) wherein the first is a nonstationary GP directly modeling the phenomenon.

Gaussian Processes

Modeling Psychotherapy Dialogues with Kernelized Hashcode Representations: A Nonparametric Information-Theoretic Approach

no code implementations · 26 Apr 2018 · Sahil Garg, Irina Rish, Guillermo Cecchi, Palash Goyal, Sarik Ghazarian, Shuyang Gao, Greg Ver Steeg, Aram Galstyan

We also derive a novel lower bound on mutual information, used as a model-selection criterion favoring representations with better alignment between the utterances of participants in a collaborative dialogue setting, as well as higher predictability of the generated responses.

Computational Efficiency · Dialogue Generation · +1

Stochastic Learning of Nonstationary Kernels for Natural Language Modeling

no code implementations · 11 Jan 2018 · Sahil Garg, Greg Ver Steeg, Aram Galstyan

Natural language processing often involves computations with semantic or syntactic graphs to facilitate sophisticated reasoning based on structural relationships.

Language Modelling

Kernelized Hashcode Representations for Relation Extraction

1 code implementation · 10 Nov 2017 · Sahil Garg, Aram Galstyan, Greg Ver Steeg, Irina Rish, Guillermo Cecchi, Shuyang Gao

Here we propose to use random subspaces of KLSH codes for efficiently constructing an explicit representation of NLP structures suitable for general classification methods.

General Classification · Relation · +1
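The idea of random subspaces of hashcodes can be sketched as follows. This is a hedged illustration that substitutes plain random-hyperplane LSH for the kernelized (KLSH) codes of the paper; the function names and sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def hashcodes(X, n_bits):
    """Random-hyperplane LSH: one binary bit per hyperplane, as a
    simple stand-in for kernelized locality-sensitive hashcodes."""
    H = rng.standard_normal((X.shape[1], n_bits))
    return (X @ H > 0).astype(np.uint8)

def random_subspaces(codes, n_subspaces, bits_per_subspace):
    """Draw random subsets of bit positions; each subset is one
    low-dimensional binary feature view that a general classifier
    (e.g. a random forest) can consume directly."""
    subsets = [rng.choice(codes.shape[1], bits_per_subspace, replace=False)
               for _ in range(n_subspaces)]
    return [codes[:, s] for s in subsets]

X = rng.standard_normal((4, 8))        # 4 toy inputs, 8 features
codes = hashcodes(X, 32)               # 32-bit codes per input
views = random_subspaces(codes, 5, 8)  # 5 random views of 8 bits each
```

The point of the construction is that the binary subspace views form an explicit feature representation, so any off-the-shelf classifier can be trained on them without kernel computations at prediction time.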

Neurogenesis-Inspired Dictionary Learning: Online Model Adaption in a Changing World

1 code implementation · 22 Jan 2017 · Sahil Garg, Irina Rish, Guillermo Cecchi, Aurelie Lozano

In this paper, we focus on online representation learning in non-stationary environments which may require continuous adaptation of model architecture.

Dictionary Learning · Hippocampus · +2

Extracting Biomolecular Interactions Using Semantic Parsing of Biomedical Text

1 code implementation · 4 Dec 2015 · Sahil Garg, Aram Galstyan, Ulf Hermjakob, Daniel Marcu

We advance the state of the art in biomolecular interaction extraction with three contributions: (i) We show that deep, Abstract Meaning Representations (AMR) significantly improve the accuracy of a biomolecular interaction extraction system when compared to a baseline that relies solely on surface- and syntax-based features; (ii) In contrast with previous approaches that infer relations on a sentence-by-sentence basis, we expand our framework to enable consistent predictions over sets of sentences (documents); (iii) We further modify and expand a graph kernel learning framework to enable concurrent exploitation of automatically induced AMR (semantic) and dependency structure (syntactic) representations.

Semantic Parsing · Sentence
