Search Results for author: Raj Sanjay Shah

Found 7 papers, 0 papers with code

Multi-Level Feedback Generation with Large Language Models for Empowering Novice Peer Counselors

no code implementations · 21 Mar 2024 · Alicja Chaszczewicz, Raj Sanjay Shah, Ryan Louie, Bruce A Arnow, Robert Kraut, Diyi Yang

We further design a self-improvement method on top of large language models to enhance the automatic generation of feedback.

Catastrophic Interference is Mitigated in Naturalistic Power-Law Learning Environments

no code implementations · 18 Jan 2024 · Atith Gandhi, Raj Sanjay Shah, Vijay Marupudi, Sashank Varma

In addition, because our method is orthogonal to other methods, future research can combine training in power-law environments with other continual learning mechanisms.

Continual Learning
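The entry above describes training in environments whose item-recurrence statistics follow a power law. As an illustrative sketch only (not the paper's code; the function name, parameters, and the new-item probability `p_new` are assumptions), such an environment can be simulated by rehearsing a previously seen item with probability proportional to a power of its age:

```python
import random

def power_law_schedule(num_steps, alpha=1.0, p_new=0.5, seed=0):
    """Toy sketch of a power-law learning environment: at each step we
    either introduce a new item or rehearse an old one, and when
    rehearsing, an item last seen t steps ago is chosen with weight
    t**-alpha, so recently seen items recur most often (a power-law
    recency pattern, as in naturalistic environments)."""
    rng = random.Random(seed)
    last_seen = {0: 0}        # item id -> step at which it last appeared
    schedule = [0]            # step 0 introduces item 0
    next_id = 1
    for step in range(1, num_steps):
        if rng.random() < p_new:
            item = next_id    # introduce a brand-new item
            next_id += 1
        else:
            items = list(last_seen)
            # weight each candidate by (age)**-alpha: smaller age, larger weight
            weights = [(step - last_seen[i]) ** -alpha for i in items]
            item = rng.choices(items, weights=weights)[0]
        last_seen[item] = step
        schedule.append(item)
    return schedule
```

Because such a schedule is just a reordering of the training stream, it can in principle be layered on top of other continual learning mechanisms (e.g. regularization- or replay-based methods), which is the orthogonality the snippet above points to.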

Pre-training LLMs using human-like development data corpus

no code implementations · 8 Nov 2023 · Khushi Bhardwaj, Raj Sanjay Shah, Sashank Varma

Pre-trained Large Language Models (LLMs) have shown success in a diverse set of language inference and understanding tasks.

Language Acquisition

Human Behavioral Benchmarking: Numeric Magnitude Comparison Effects in Large Language Models

no code implementations · 18 May 2023 · Raj Sanjay Shah, Vijay Marupudi, Reba Koenen, Khushi Bhardwaj, Sashank Varma

This research shows the utility of understanding LLMs using behavioral benchmarks and points the way to future work on the number representations of LLMs and their cognitive plausibility.

Benchmarking
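The magnitude comparison effects named in the title above are classic human behavioral signatures: comparison gets easier as numerical distance grows (the distance effect) and harder as magnitude grows at fixed distance (the size effect). A minimal toy model, assuming a noisy log-scale "mental number line" (an illustrative assumption, not the paper's method), reproduces both effects:

```python
import math
import random

def compare_accuracy(a, b, noise=0.3, trials=2000, seed=0):
    """Toy mental-number-line model: each number is encoded as its log
    position plus Gaussian noise, and the comparison answer is read off
    the noisy positions. Accuracy then rises with numerical distance
    (distance effect) and falls with magnitude at fixed distance
    (size effect), the benchmark signatures named in the title."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        xa = math.log(a) + rng.gauss(0, noise)  # noisy encoding of a
        xb = math.log(b) + rng.gauss(0, noise)  # noisy encoding of b
        if (xa > xb) == (a > b):
            correct += 1
    return correct / trials
```

Benchmarking an LLM against such signatures means checking whether its error or similarity patterns over number pairs show the same ordering, rather than only scoring raw task accuracy.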

Helping the Helper: Supporting Peer Counselors via AI-Empowered Practice and Feedback

no code implementations · 15 May 2023 · Shang-Ling Hsu, Raj Sanjay Shah, Prathik Senthil, Zahra Ashktorab, Casey Dugan, Werner Geyer, Diyi Yang

Millions of users come to online peer counseling platforms to seek support on diverse topics ranging from relationship stress to anxiety.

Text Generation

Modeling Motivational Interviewing Strategies On An Online Peer-to-Peer Counseling Platform

no code implementations · 9 Nov 2022 · Raj Sanjay Shah, Faye Holt, Shirley Anugrah Hayati, Aastha Agarwal, Yi-Chia Wang, Robert E. Kraut, Diyi Yang

This work provides a deeper understanding of the use of motivational interviewing techniques on peer-to-peer counseling platforms and sheds light on how to build better training programs for volunteer counselors on online platforms.

WHEN FLUE MEETS FLANG: Benchmarks and Large Pre-trained Language Model for Financial Domain

no code implementations · 31 Oct 2022 · Raj Sanjay Shah, Kunal Chawla, Dheeraj Eidnani, Agam Shah, Wendi Du, Sudheer Chava, Natraj Raman, Charese Smiley, Jiaao Chen, Diyi Yang

To this end, we contribute the Financial Language Understanding Evaluation (FLUE), an open-source comprehensive suite of benchmarks for the financial domain.

FLUE · Language Modelling
