conversationStore?: ConversationStore — Pluggable conversation store; opt-in, stateless by default.
maxRetries?: number — Maximum retry attempts (default: 2).
maxSteps?: number — Maximum tool-loop iterations when stopWhen is not specified (default: 10).
onStepFinish?: GenerateTextOnStepFinishCallback<ToolSet> — Callback invoked after each step.
output?: any — Structured output specification.
prepareStep?: PrepareStepFunction<ToolSet> — Customize each step before execution.
Prompt file name (e.g. 'my_agent@v1'); required.
promptDir?: string — Override the stack-resolved prompt directory.
seed?: number — Random seed for deterministic output.
skills?: Skill[] — Static skill packages made available to the LLM.
stopWhen?: any — Custom stop condition(s); overrides maxSteps.
temperature?: number — Generation temperature (overrides the prompt file value).
tools?: ToolSet — AI SDK tools available during the reasoning loop.
topK?: number — Top-k sampling.
topP?: number — Top-p sampling.
variables?: Record<string, unknown> — Variables used to render the prompt template at construction time.
run()
Run the agent and return when complete. Returns the same augmented shape as generateText: result, optional cost, merged sources.
options?: { … } — optional run parameters.

stream()
Stream the agent's response. onFinish receives WrappedStreamTextOnFinishEvent (cost optional), matching streamText.
options?: OutputAgentStreamParameters — optional stream parameters.
Agent extends AI SDK's ToolLoopAgent with Output.ai prompt file rendering and the skill system.
Example: Workflow step — variables per call, stateless
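A sketch of the stateless workflow-step pattern, assuming a hypothetical import path and the option names listed above (the `prompt` property name and the shape of the run result are inferred from this page, not confirmed):

```typescript
// Hypothetical import path — adjust to the actual Output.ai package name.
import { Agent } from "@output.ai/core";

// Workflow step: a fresh, stateless Agent per call (no conversationStore),
// with template variables supplied at construction time.
export async function summarizeStep(articleText: string) {
  const agent = new Agent({
    prompt: "summarize_article@v1",      // prompt file name (property name assumed)
    variables: { article: articleText }, // rendered into the prompt template
    temperature: 0.2,                    // overrides the prompt file value
    maxSteps: 5,                         // cap the tool loop
  });

  // run() resolves when the loop completes; per the docs, the result mirrors
  // generateText, with an optional cost and merged sources.
  const { result, cost } = await agent.run();
  return { summary: result.text, cost };
}
```

Because nothing is shared between calls, this form is safe to invoke concurrently from parallel workflow branches.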
Example: Interactive — fixed setup, conversation history
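A sketch of the interactive pattern under the same assumptions; the store class name and the fields of the stream options object (`OutputAgentStreamParameters`) are placeholders, and the streamed-result shape is assumed to match the AI SDK's streamText:

```typescript
// Hypothetical import path and store class name — both assumptions.
import { Agent, InMemoryConversationStore } from "@output.ai/core";

// Interactive: one fixed Agent reused across turns, opted in to
// conversation history via a pluggable store.
const agent = new Agent({
  prompt: "support_chat@v1",
  conversationStore: new InMemoryConversationStore(), // opt-in; stateless without it
});

export async function handleTurn(userMessage: string) {
  const stream = await agent.stream({
    // Field names on OutputAgentStreamParameters are assumed here.
    prompt: userMessage,
    onFinish: (event) => {
      // Per the docs, event is a WrappedStreamTextOnFinishEvent and
      // event.cost may be undefined.
      console.log("cost:", event.cost);
    },
  });
  for await (const chunk of stream.textStream) {
    process.stdout.write(chunk);
  }
}
```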