arxiv:2410.12952

Facilitating Multi-turn Function Calling for LLMs via Compositional Instruction Tuning

Published on Oct 16, 2024
Abstract

Large Language Models (LLMs) have exhibited significant potential in performing diverse tasks, including the ability to call functions or use external tools to enhance their performance. While current research on function calling by LLMs primarily focuses on single-turn interactions, this paper addresses the overlooked necessity for LLMs to engage in multi-turn function calling, which is critical for handling compositional, real-world queries that require planning with functions rather than merely invoking them. To facilitate this, we introduce an approach, BUTTON, which generates synthetic compositional instruction-tuning data via bottom-up instruction construction and top-down trajectory generation. In the bottom-up phase, we generate simple atomic tasks based on real-world scenarios and build compositional tasks from these atomic tasks using heuristic strategies. Corresponding functions are then developed for these compositional tasks. The top-down phase features a multi-agent environment where interactions among simulated humans, assistants, and tools are used to gather multi-turn function-calling trajectories. This approach ensures task compositionality and allows for effective function and trajectory generation by examining the atomic tasks within compositional tasks. We produce a dataset, BUTTONInstruct, comprising 8k data points and demonstrate its effectiveness through extensive experiments across various LLMs.
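
The abstract does not show the data schema itself, so the following is a minimal sketch of what one BUTTONInstruct-style data point might look like, assuming a common chat-message trajectory format. All field names, the example task, and the functions (get_user_city, get_weather_forecast) are hypothetical illustrations, not the paper's actual schema.

```python
# Hypothetical sketch of a single compositional instruction-tuning data point.
# Bottom-up phase: atomic tasks are composed into one task whose answer
# requires chaining functions; functions are then developed for that task.
atomic_tasks = [
    "Find the user's current city.",
    "Get the weather forecast for a given city.",
]

compositional_task = {
    "instruction": "What should I wear outside tomorrow?",
    "functions": [
        {"name": "get_user_city", "parameters": {}},
        {"name": "get_weather_forecast",
         "parameters": {"city": "string", "date": "string"}},
    ],
}

# Top-down phase: a multi-turn trajectory gathered from simulated
# human/assistant/tool interactions. The assistant must plan across turns:
# the second function call depends on the result of the first.
trajectory = [
    {"role": "user", "content": compositional_task["instruction"]},
    {"role": "assistant",
     "function_call": {"name": "get_user_city", "arguments": {}}},
    {"role": "tool", "name": "get_user_city", "content": "Paris"},
    {"role": "assistant",
     "function_call": {"name": "get_weather_forecast",
                       "arguments": {"city": "Paris", "date": "tomorrow"}}},
    {"role": "tool", "name": "get_weather_forecast",
     "content": "12°C, light rain"},
    {"role": "assistant",
     "content": "Tomorrow will be cool and rainy in Paris, "
                "so bring a waterproof jacket."},
]
```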
