Search Results for author: Shishir G. Patil

Found 6 papers, 6 papers with code

GoEX: Perspectives and Designs Towards a Runtime for Autonomous LLM Applications

1 code implementation10 Apr 2024 Shishir G. Patil, Tianjun Zhang, Vivian Fang, Noppapon C., Roy Huang, Aaron Hao, Martin Casado, Joseph E. Gonzalez, Raluca Ada Popa, Ion Stoica

We believe this is critical to unlock the potential for LLM agents to interact with applications and services with limited (post-facto) human involvement.

RAFT: Adapting Language Model to Domain Specific RAG

1 code implementation15 Mar 2024 Tianjun Zhang, Shishir G. Patil, Naman Jain, Sheng Shen, Matei Zaharia, Ion Stoica, Joseph E. Gonzalez

In this paper, we present Retrieval Augmented FineTuning (RAFT), a training recipe that improves the model's ability to answer questions in an "open-book", in-domain setting.

Language Modelling

MemGPT: Towards LLMs as Operating Systems

1 code implementation12 Oct 2023 Charles Packer, Sarah Wooders, Kevin Lin, Vivian Fang, Shishir G. Patil, Ion Stoica, Joseph E. Gonzalez

Large language models (LLMs) have revolutionized AI, but are constrained by limited context windows, hindering their utility in tasks like extended conversations and document analysis.

Management

Gorilla: Large Language Model Connected with Massive APIs

1 code implementation24 May 2023 Shishir G. Patil, Tianjun Zhang, Xin Wang, Joseph E. Gonzalez

Large Language Models (LLMs) have seen an impressive wave of advances recently, with models now excelling in a variety of tasks, such as mathematical reasoning and program synthesis.

Hallucination, Language Modelling, +4 more

POET: Training Neural Networks on Tiny Devices with Integrated Rematerialization and Paging

1 code implementation15 Jul 2022 Shishir G. Patil, Paras Jain, Prabal Dutta, Ion Stoica, Joseph E. Gonzalez

We demonstrate that it is possible to fine-tune both ResNet-18 and BERT within the memory constraints of a Cortex-M class embedded device while outperforming current edge training methods in energy efficiency.

Privacy Preserving
