
Tool Use

When an LLM calls external functions or APIs as part of its response. The mechanism that turns an LLM from a text generator into an agent that can act. Also called function calling.

How it works

The LLM is shown a list of available tools, each with a schema (parameter names and types) and a description. When the LLM decides to call a tool, it emits structured output naming the tool and its arguments. The runtime executes the tool and feeds the result back as a new message in the conversation.
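The loop above can be sketched in a few lines. This is a minimal, provider-agnostic illustration: the tool name, schema fields, and `get_weather` function are hypothetical, and the model's structured output is hard-coded where a real API response would be.

```python
# Hypothetical tool with a JSON-Schema-style definition, roughly the
# shape most function-calling APIs expect (exact fields vary by provider).
def get_weather(city: str) -> str:
    # Stub implementation for illustration only.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

TOOL_SCHEMAS = [
    {
        "name": "get_weather",
        "description": "Return current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

# A real LLM, shown TOOL_SCHEMAS, would emit structured output like this.
model_output = {"tool": "get_weather", "arguments": {"city": "Paris"}}

# The runtime executes the named tool with the given arguments...
messages = [{"role": "user", "content": "What's the weather in Paris?"}]
result = TOOLS[model_output["tool"]](**model_output["arguments"])

# ...and feeds the result back as a new message for the next model turn.
messages.append({"role": "tool", "name": model_output["tool"], "content": result})
```

The key design point is that the model never executes anything itself: it only names a tool and its arguments, and the runtime decides whether and how to run it.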

Example

A customer service agent has three tools: get_order_status, process_refund, and escalate_to_human. The LLM reads a customer message and picks the right tool; the runtime executes it and feeds the result back; the LLM then either takes another action or responds to the customer.

Need to actually use Tool Use?

We build production AI systems that put these concepts to work. In 30 minutes, we'll map your use case.