Search Results for author: Anthony Ferritto

Found 12 papers, 3 papers with code

SPARTAN: Sparse Hierarchical Memory for Parameter-Efficient Transformers

1 code implementation • 29 Nov 2022 • Ameet Deshpande, Md Arafat Sultan, Anthony Ferritto, Ashwin Kalyan, Karthik Narasimhan, Avirup Sil

Fine-tuning pre-trained language models (PLMs) achieves impressive performance on a range of downstream tasks, and their sizes have consequently been getting bigger.

GAAMA 2.0: An Integrated System that Answers Boolean and Extractive Questions

no code implementations • 16 Jun 2022 • Scott McCarley, Mihaela Bornea, Sara Rosenthal, Anthony Ferritto, Md Arafat Sultan, Avirup Sil, Radu Florian

Recent machine reading comprehension datasets include extractive and boolean questions but current approaches do not offer integrated support for answering both question types.

Machine Reading Comprehension

VAULT: VAriable Unified Long Text Representation for Machine Reading Comprehension

no code implementations • ACL 2021 • Haoyang Wen, Anthony Ferritto, Heng Ji, Radu Florian, Avirup Sil

Existing models on Machine Reading Comprehension (MRC) require complex model architecture for effectively modeling long texts with paragraph representation and classification, thereby making inference computationally inefficient for production use.

Machine Reading Comprehension · Natural Questions

Towards building a Robust Industry-scale Question Answering System

no code implementations • COLING 2020 • Rishav Chakravarti, Anthony Ferritto, Bhavani Iyer, Lin Pan, Radu Florian, Salim Roukos, Avi Sil

Building on top of the powerful BERTQA model, GAAMA provides a ~2.0% absolute boost in F1 over the industry-scale state-of-the-art (SOTA) system on NQ.

Data Augmentation · Natural Questions +2

ARES: A Reading Comprehension Ensembling Service

no code implementations • EMNLP 2020 • Anthony Ferritto, Lin Pan, Rishav Chakravarti, Salim Roukos, Radu Florian, J. William Murdock, Avi Sil

We introduce ARES (A Reading Comprehension Ensembling Service): a novel Machine Reading Comprehension (MRC) demonstration system which utilizes an ensemble of models to increase F1 by 2.3 points.

Machine Reading Comprehension · Natural Questions +1

Ensembling Strategies for Answering Natural Questions

no code implementations • 30 Oct 2019 • Anthony Ferritto, Lin Pan, Rishav Chakravarti, Salim Roukos, Radu Florian, J. William Murdock, Avirup Sil

Many of the top question answering systems today utilize ensembling to improve their performance on tasks such as the Stanford Question Answering Dataset (SQuAD) and Natural Questions (NQ) challenges.

Natural Questions · Question Answering
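The two ensembling entries above mention ensembling but not a concrete strategy. As a rough, hypothetical sketch (not the method described in either paper), the Python snippet below combines candidate answer spans from several extractive readers by summing their scores, so spans that many models agree on win out; the names (Span, ensemble_spans) and the toy scores are invented for illustration.

from collections import defaultdict
from typing import Dict, List, Tuple

Span = Tuple[int, int]  # (start_token, end_token) of a candidate answer

def ensemble_spans(per_model_scores: List[Dict[Span, float]]) -> Span:
    """Sum each candidate span's score across models and return the top span."""
    totals: Dict[Span, float] = defaultdict(float)
    for scores in per_model_scores:
        for span, score in scores.items():
            totals[span] += score
    # Spans proposed (and scored highly) by more models accumulate more total score.
    return max(totals, key=totals.get)

# Toy example: three readers score overlapping candidate spans.
model_outputs = [
    {(10, 12): 0.8, (30, 33): 0.6},
    {(10, 12): 0.7, (15, 18): 0.5},
    {(10, 12): 0.9, (30, 33): 0.4},
]
print(ensemble_spans(model_outputs))  # -> (10, 12)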

Frustratingly Easy Natural Question Answering

no code implementations • 11 Sep 2019 • Lin Pan, Rishav Chakravarti, Anthony Ferritto, Michael Glass, Alfio Gliozzo, Salim Roukos, Radu Florian, Avirup Sil

Existing literature on Question Answering (QA) mostly focuses on algorithmic novelty, data augmentation, or increasingly large pre-trained language models like XLNet and RoBERTa.

Data Augmentation · Natural Questions +2

Span Selection Pre-training for Question Answering

1 code implementation • ACL 2020 • Michael Glass, Alfio Gliozzo, Rishav Chakravarti, Anthony Ferritto, Lin Pan, G P Shrivatsa Bhargav, Dinesh Garg, Avirup Sil

BERT (Bidirectional Encoder Representations from Transformers) and related pre-trained Transformers have provided large gains across many language understanding tasks, achieving a new state-of-the-art (SOTA).

Language Modelling · Memorization +4

CFO: A Framework for Building Production NLP Systems

no code implementations • IJCNLP 2019 • Rishav Chakravarti, Cezar Pendus, Andrzej Sakrajda, Anthony Ferritto, Lin Pan, Michael Glass, Vittorio Castelli, J. William Murdock, Radu Florian, Salim Roukos, Avirup Sil

This paper introduces a novel orchestration framework, called CFO (COMPUTATION FLOW ORCHESTRATOR), for building, experimenting with, and deploying interactive NLP (Natural Language Processing) and IR (Information Retrieval) systems to production environments.

Information Retrieval · Machine Reading Comprehension +2
