1 code implementation • NAACL (DaSH) 2021 • Rebecca Iglesias-Flores, Megha Mishra, Ajay Patel, Akanksha Malhotra, Reno Kriz, Martha Palmer, Chris Callison-Burch
Acquiring training data for natural language processing systems can be expensive and time-consuming.
1 code implementation • NAACL (Wordplay) 2022 • Ryan Volum, Sudha Rao, Michael Xu, Gabriel DesGarennes, Chris Brockett, Benjamin Van Durme, Olivia Deng, Akanksha Malhotra, Bill Dolan
In this work, we demonstrate that use of a few example conversational prompts can power a conversational agent to generate both natural language and novel code.
no code implementations • 22 May 2023 • Ashish Sharma, Sudha Rao, Chris Brockett, Akanksha Malhotra, Nebojsa Jojic, Bill Dolan
While LLMs are being developed to simulate human behavior and serve as human-like agents, little attention has been given to the agency these models should possess in order to proactively manage the direction of interaction and collaboration.
no code implementations • ACL 2021 • Piyush Mishra, Akanksha Malhotra, Susan Windisch Brown, Martha Palmer, Ghazaleh Kazeminejad
Much past work has focused on extracting information like events, entities, and relations from documents.
no code implementations • 25 Feb 2021 • Christopher M Ormerod, Akanksha Malhotra, Amir Jafari
Automated Essay Scoring (AES) is a cross-disciplinary effort involving Education, Linguistics, and Natural Language Processing (NLP).
no code implementations • 23 Jan 2021 • Akanksha Malhotra, Sudhir Kamle
ARTH is a self-learning set of algorithms that intelligently fulfills the need to "read and understand text effortlessly," adjusting to the needs of each user.
no code implementations • WS 2020 • Sarah Beemer, Zak Boston, April Bukoski, Daniel Chen, Princess Dickens, Andrew Gerlach, Torin Hopkins, Parth Jawale, Chris Koski, Akanksha Malhotra, Piyush Mishra, Saliha Muradoglu, Lan Sang, Tyler Short, Sagarika Shreevastava, Elizabeth Spaulding, Testumichi Umada, Beilei Xiang, Changbing Yang, Mans Hulden
Sequence-to-sequence models have proven to be highly successful in learning morphological inflection from examples as the series of SIGMORPHON/CoNLL shared tasks have shown.