Search Results for author: Jianqiao Lu

Found 4 papers, 1 paper with code

Planning, Creation, Usage: Benchmarking LLMs for Comprehensive Tool Utilization in Real-World Complex Scenarios

1 code implementation • 30 Jan 2024 • Shijue Huang, Wanjun Zhong, Jianqiao Lu, Qi Zhu, Jiahui Gao, Weiwen Liu, Yutai Hou, Xingshan Zeng, Yasheng Wang, Lifeng Shang, Xin Jiang, Ruifeng Xu, Qun Liu

The recent trend of using Large Language Models (LLMs) as tool agents in real-world applications underscores the necessity for comprehensive evaluations of their capabilities, particularly in complex scenarios involving planning, creating, and using tools.

Benchmarking
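A minimal illustrative sketch, not taken from the paper, of how an LLM tool agent might be scored separately on the three stages the benchmark targets (planning, creation, and usage). All names here (ToolTask, evaluate_agent, the agent callable) are hypothetical stand-ins, not the benchmark's actual API.

```python
# Toy harness for per-stage evaluation of a tool-using LLM agent.
# Everything here is an assumption for illustration, not the paper's code.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ToolTask:
    query: str                 # user request the agent must solve
    reference_plan: List[str]  # gold sequence of tool calls
    reference_tool: str        # gold specification of a tool the agent should create
    reference_answer: str      # gold final answer obtained by using the tools


def evaluate_agent(agent: Callable[[str, str], str],
                   tasks: List[ToolTask]) -> Dict[str, float]:
    """Compare the agent's output at each stage against the references."""
    scores = {"planning": 0.0, "creation": 0.0, "usage": 0.0}
    for task in tasks:
        plan = agent("plan", task.query)
        tool = agent("create", task.query)
        answer = agent("use", task.query)
        scores["planning"] += float(plan.strip() == " -> ".join(task.reference_plan))
        scores["creation"] += float(tool.strip() == task.reference_tool)
        scores["usage"] += float(answer.strip() == task.reference_answer)
    return {stage: total / len(tasks) for stage, total in scores.items()}
```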

YODA: Teacher-Student Progressive Learning for Language Models

no code implementations • 28 Jan 2024 • Jianqiao Lu, Wanjun Zhong, YuFei Wang, Zhijiang Guo, Qi Zhu, Wenyong Huang, Yanlin Wang, Fei Mi, Baojun Wang, Yasheng Wang, Lifeng Shang, Xin Jiang, Qun Liu

With the teacher's guidance, the student learns to iteratively refine its answer with feedback, and forms a robust and comprehensive understanding of the posed questions.

GSM8K • Math
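A minimal sketch of the kind of teacher-student refinement loop described above, offered only as an illustration. The callables `student_answer`, `teacher_feedback`, and `is_satisfactory` are hypothetical placeholders, not YODA's actual interfaces.

```python
# Generic iterative-refinement loop: the student answers, the teacher gives
# feedback, and the student revises until the teacher is satisfied.
# All function names are illustrative assumptions, not the paper's code.
def progressive_refinement(question: str,
                           student_answer,    # fn(question, feedback) -> answer
                           teacher_feedback,  # fn(question, answer) -> feedback
                           is_satisfactory,   # fn(feedback) -> bool
                           max_rounds: int = 3) -> str:
    """Iteratively refine the student's answer using teacher feedback."""
    feedback = None
    answer = student_answer(question, feedback)
    for _ in range(max_rounds):
        feedback = teacher_feedback(question, answer)
        if is_satisfactory(feedback):
            break
        answer = student_answer(question, feedback)  # revise with the feedback
    return answer
```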

Improving End-to-End Speech Processing by Efficient Text Data Utilization with Latent Synthesis

no code implementations • 9 Oct 2023 • Jianqiao Lu, Wenyong Huang, Nianzu Zheng, Xingshan Zeng, Yu Ting Yeung, Xiao Chen

For SLU, LaSyn improves our E2E baseline by an absolute 4.1% in intent classification accuracy and 3.8% in slot filling SLU-F1 on SLURP, and by an absolute 4.49% and 2.25% in exact match (EM) and EM-Tree accuracies on STOP, respectively.

Automatic Speech Recognition (ASR) +6

SELF: Self-Evolution with Language Feedback

no code implementations • 1 Oct 2023 • Jianqiao Lu, Wanjun Zhong, Wenyong Huang, YuFei Wang, Qi Zhu, Fei Mi, Baojun Wang, Weichao Wang, Xingshan Zeng, Lifeng Shang, Xin Jiang, Qun Liu

SELF initiates with a meta-skill learning process that equips the LLMs with capabilities for self-feedback and self-refinement.

Language Modelling • Large Language Model
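A minimal sketch of a self-feedback and self-refinement step in the spirit of the meta-skills described above, purely illustrative. The `generate` callable is a hypothetical text-generation function, not SELF's actual API.

```python
# One model critiques and revises its own output: generate an answer,
# ask itself for feedback, then produce a revised answer.
# The prompting scheme here is an assumption for illustration only.
def self_refine(generate, question: str, rounds: int = 2) -> str:
    """Generate an answer, then repeatedly self-critique and revise it."""
    answer = generate(f"Question: {question}\nAnswer:")
    for _ in range(rounds):
        critique = generate(
            f"Question: {question}\nAnswer: {answer}\n"
            "Give concise feedback on flaws in the answer:")
        answer = generate(
            f"Question: {question}\nPrevious answer: {answer}\n"
            f"Feedback: {critique}\nRevised answer:")
    return answer
```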
